Jan 31 09:25:02 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 31 09:25:02 crc restorecon[4760]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:25:02 crc restorecon[4760]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc 
restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc 
restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 
09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 
crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 
09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:02 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 09:25:03 crc restorecon[4760]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 
crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc 
restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:25:03 crc restorecon[4760]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 31 09:25:04 crc kubenswrapper[4992]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:25:04 crc kubenswrapper[4992]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 31 09:25:04 crc kubenswrapper[4992]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:25:04 crc kubenswrapper[4992]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 31 09:25:04 crc kubenswrapper[4992]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 31 09:25:04 crc kubenswrapper[4992]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.918069 4992 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925665 4992 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925698 4992 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925709 4992 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925718 4992 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925727 4992 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925736 4992 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925747 4992 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925760 4992 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925770 4992 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925800 4992 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925809 4992 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925818 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925828 4992 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925837 4992 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925846 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925854 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925864 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925875 4992 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925887 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925897 4992 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925907 4992 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925916 4992 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925925 4992 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925934 4992 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925943 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925951 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925963 4992 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925972 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925984 4992 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.925994 4992 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926004 4992 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926013 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926022 4992 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926031 4992 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926041 4992 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926051 4992 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926060 4992 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926070 4992 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926080 4992 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926089 4992 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926100 4992 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926109 4992 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926117 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926125 4992 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926134 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926144 4992 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926153 4992 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926161 4992 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926169 4992 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926177 4992 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926185 4992 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926194 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926202 4992 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926210 4992 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926218 4992 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926226 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926234 4992 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926243 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926251 4992 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926260 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926268 4992 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926276 4992 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926285 4992 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926293 4992 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926301 4992 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926310 4992 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926318 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926326 4992 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926334 4992 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926342 4992 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.926350 4992 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927295 4992 flags.go:64] FLAG: --address="0.0.0.0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927324 4992 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927343 4992 flags.go:64] FLAG: --anonymous-auth="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927355 4992 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927368 4992 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927379 4992 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927390 4992 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927402 4992 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927412 4992 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927452 4992 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927462 4992 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927472 4992 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927482 4992 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927491 4992 flags.go:64] FLAG: --cgroup-root=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927501 4992 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927511 4992 flags.go:64] FLAG: --client-ca-file=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927520 4992 flags.go:64] FLAG: --cloud-config=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927529 4992 flags.go:64] FLAG: --cloud-provider=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927538 4992 flags.go:64] FLAG: --cluster-dns="[]"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927549 4992 flags.go:64] FLAG: --cluster-domain=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927559 4992 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927569 4992 flags.go:64] FLAG: --config-dir=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927578 4992 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927588 4992 flags.go:64] FLAG: --container-log-max-files="5"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927600 4992 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927610 4992 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927620 4992 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927630 4992 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927640 4992 flags.go:64] FLAG: --contention-profiling="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927649 4992 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927659 4992 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927669 4992 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927679 4992 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927690 4992 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927700 4992 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927710 4992 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927720 4992 flags.go:64] FLAG: --enable-load-reader="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927730 4992 flags.go:64] FLAG: --enable-server="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927740 4992 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927763 4992 flags.go:64] FLAG: --event-burst="100"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927772 4992 flags.go:64] FLAG: --event-qps="50"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927783 4992 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927792 4992 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927802 4992 flags.go:64] FLAG: --eviction-hard=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927823 4992 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927832 4992 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927842 4992 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927852 4992 flags.go:64] FLAG: --eviction-soft=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927862 4992 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927871 4992 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927880 4992 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927891 4992 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927900 4992 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927910 4992 flags.go:64] FLAG: --fail-swap-on="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927919 4992 flags.go:64] FLAG: --feature-gates=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927930 4992 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927941 4992 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927950 4992 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927960 4992 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927970 4992 flags.go:64] FLAG: --healthz-port="10248"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927980 4992 flags.go:64] FLAG: --help="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.927991 4992 flags.go:64] FLAG: --hostname-override=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928000 4992 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928010 4992 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928020 4992 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928029 4992 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928038 4992 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928048 4992 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928058 4992 flags.go:64] FLAG: --image-service-endpoint=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928067 4992 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928077 4992 flags.go:64] FLAG: --kube-api-burst="100"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928086 4992 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928097 4992 flags.go:64] FLAG: --kube-api-qps="50"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928107 4992 flags.go:64] FLAG: --kube-reserved=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928124 4992 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928134 4992 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928144 4992 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928153 4992 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928162 4992 flags.go:64] FLAG: --lock-file=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928172 4992 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928181 4992 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928191 4992 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928206 4992 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928215 4992 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928224 4992 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928234 4992 flags.go:64] FLAG: --logging-format="text"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928243 4992 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928253 4992 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928263 4992 flags.go:64] FLAG: --manifest-url=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928272 4992 flags.go:64] FLAG: --manifest-url-header=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928284 4992 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928294 4992 flags.go:64] FLAG: --max-open-files="1000000"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928306 4992 flags.go:64] FLAG: --max-pods="110"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928316 4992 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928326 4992 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928336 4992 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928346 4992 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928358 4992 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928368 4992 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928378 4992 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928398 4992 flags.go:64] FLAG: --node-status-max-images="50"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928407 4992 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928469 4992 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928481 4992 flags.go:64] FLAG: --pod-cidr=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928490 4992 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928503 4992 flags.go:64] FLAG: --pod-manifest-path=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928512 4992 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928522 4992 flags.go:64] FLAG: --pods-per-core="0"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928532 4992 flags.go:64] FLAG: --port="10250"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928543 4992 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928553 4992 flags.go:64] FLAG: --provider-id=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928562 4992 flags.go:64] FLAG: --qos-reserved=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928580 4992 flags.go:64] FLAG: --read-only-port="10255"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928592 4992 flags.go:64] FLAG: --register-node="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928604 4992 flags.go:64] FLAG: --register-schedulable="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928617 4992 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928638 4992 flags.go:64] FLAG: --registry-burst="10"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928651 4992 flags.go:64] FLAG: --registry-qps="5"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928666 4992 flags.go:64] FLAG: --reserved-cpus=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928677 4992 flags.go:64] FLAG: --reserved-memory=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928691 4992 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928703 4992 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928714 4992 flags.go:64] FLAG: --rotate-certificates="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928724 4992 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928734 4992 flags.go:64] FLAG: --runonce="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928743 4992 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928753 4992 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928763 4992 flags.go:64] FLAG: --seccomp-default="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928772 4992 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928782 4992 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928793 4992 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928802 4992 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928813 4992 flags.go:64] FLAG: --storage-driver-password="root"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928825 4992 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928835 4992 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928845 4992 flags.go:64] FLAG: --storage-driver-user="root"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928854 4992 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928864 4992 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928873 4992 flags.go:64] FLAG: --system-cgroups=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928883 4992 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928899 4992 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928911 4992 flags.go:64] FLAG: --tls-cert-file=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928922 4992 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928937 4992 flags.go:64] FLAG: --tls-min-version=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928954 4992 flags.go:64] FLAG: --tls-private-key-file=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928969 4992 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928981 4992 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.928993 4992 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.929005 4992 flags.go:64] FLAG: --v="2"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.929018 4992 flags.go:64] FLAG: --version="false"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.929031 4992 flags.go:64] FLAG: --vmodule=""
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.929042 4992 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.929052 4992 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929293 4992 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929305 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929315 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929324 4992 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929334 4992 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929345 4992 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929356 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929365 4992 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929374 4992 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929383 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929392 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929400 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929411 4992 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929457 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929469 4992 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929480 4992 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929490 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929500 4992 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929511 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929521 4992 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929533 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929543 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929559 4992 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929578 4992 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929589 4992 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929599 4992 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929610 4992 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929620 4992 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929631 4992 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929640 4992 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929649 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929657 4992 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929666 4992 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929675 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929686 4992 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929696 4992 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929706 4992 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929717 4992 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929727 4992 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929736 4992 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929745 4992 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929757 4992 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929768 4992 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929777 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929786 4992 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929795 4992 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929803 4992 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929812 4992 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929820 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929829 4992 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929837 4992 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929845 4992 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929854 4992 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929862 4992 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929871 4992 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929882 4992 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929890 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929898 4992 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929907 4992 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929916 4992 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929924 4992 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929932 4992 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929941 4992 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929951 4992 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929963 4992 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929977 4992 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.929989 4992 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.930000 4992 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.930011 4992 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.930020 4992 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.930030 4992 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.930059 4992 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.943877 4992 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.943896 4992 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943967 4992 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943976 4992 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943982 4992 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943986 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943990 4992 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943994 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.943998 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944001 4992 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944005 4992 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944009 4992 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944012 4992 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944015 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944019 4992 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944022 4992 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944026 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944030 4992 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944035 4992 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944038 4992 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944042 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944046 4992 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944049 4992 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944053 4992 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944056 4992 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944060 4992 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944064 4992 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944067 4992 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944070 4992 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944074 4992 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944077 4992 feature_gate.go:330] unrecognized feature gate: Example Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944081 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944084 4992 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944088 4992 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944092 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944096 4992 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944101 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944105 4992 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944109 4992 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944113 4992 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944116 4992 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944120 4992 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944124 4992 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944127 4992 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944131 4992 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944134 4992 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944139 4992 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944144 4992 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944149 4992 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944154 4992 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944158 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944162 4992 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944165 4992 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944169 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944173 4992 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944177 4992 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944180 4992 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944185 4992 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944189 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944193 4992 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944197 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944200 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944204 4992 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944208 4992 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944212 4992 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944215 4992 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944219 4992 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944224 4992 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944227 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944231 4992 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944234 4992 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944238 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944242 4992 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.944248 4992 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944349 4992 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944355 4992 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944359 4992 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944362 4992 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944366 4992 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944369 4992 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944373 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944377 4992 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944380 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944384 4992 feature_gate.go:330] unrecognized feature gate: 
ImageStreamImportMode Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944388 4992 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944391 4992 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944395 4992 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944398 4992 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944402 4992 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944405 4992 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944408 4992 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944451 4992 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944455 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944459 4992 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944463 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944466 4992 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944469 4992 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944473 4992 feature_gate.go:330] unrecognized feature gate: Example Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944477 4992 
feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944481 4992 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944486 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944490 4992 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944494 4992 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944498 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944502 4992 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944506 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944510 4992 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944514 4992 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944518 4992 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944522 4992 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944525 4992 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944529 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944532 4992 feature_gate.go:330] unrecognized feature 
gate: VSphereControlPlaneMachineSet Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944536 4992 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944539 4992 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944543 4992 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944547 4992 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944550 4992 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944553 4992 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944557 4992 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944561 4992 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944564 4992 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944568 4992 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944571 4992 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944576 4992 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944581 4992 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944585 4992 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944589 4992 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944593 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944597 4992 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944601 4992 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944604 4992 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944608 4992 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944613 4992 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944617 4992 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944621 4992 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944625 4992 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944628 4992 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944632 4992 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944636 4992 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944641 4992 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944644 4992 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944648 4992 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944652 4992 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 09:25:04 crc kubenswrapper[4992]: W0131 09:25:04.944657 4992 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.944663 4992 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.945741 4992 server.go:940] "Client rotation is on, will bootstrap in background" Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.951523 4992 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.951636 4992 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.956158 4992 server.go:997] "Starting client certificate rotation" Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.956207 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.956452 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-13 00:17:42.428582313 +0000 UTC Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.956581 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.987779 4992 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:25:04 crc kubenswrapper[4992]: E0131 09:25:04.991278 4992 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:04 crc kubenswrapper[4992]: I0131 09:25:04.995462 4992 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.011135 4992 log.go:25] "Validated CRI v1 runtime API" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.076898 4992 log.go:25] "Validated CRI v1 image API" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.079037 4992 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.087391 4992 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-31-09-20-03-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.087446 4992 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.100761 4992 manager.go:217] Machine: {Timestamp:2026-01-31 09:25:05.098847739 +0000 UTC m=+1.070239736 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a568f6a4-7307-4080-940d-10f688be5b04 BootID:dd4112a1-95ba-4903-8139-c099442066c8 Filesystems:[{Device:/run DeviceMajor:0 
DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a7:5f:f5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a7:5f:f5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:03:f0:39 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e4:c8:2c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:27:e3:24 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a7:e1:76 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:93:21:c3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:4c:e8:92:f7:c9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0a:a2:e2:0b:5e:d3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 
Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.100957 4992 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.101057 4992 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.101515 4992 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.101703 4992 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.101741 4992 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.101967 4992 topology_manager.go:138] "Creating topology manager with none policy" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.101980 4992 container_manager_linux.go:303] "Creating device plugin manager" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.102914 4992 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.102953 4992 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.103184 4992 state_mem.go:36] "Initialized new in-memory state store" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.103278 4992 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.107688 4992 kubelet.go:418] "Attempting to sync node with API server" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.107718 4992 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.107740 4992 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.107752 4992 kubelet.go:324] "Adding apiserver pod source" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.107762 4992 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.112530 4992 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.113655 4992 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.115635 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.115744 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.115676 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.115836 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.119684 4992 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122056 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122085 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 
09:25:05.122094 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122103 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122116 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122124 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122133 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122146 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122156 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122165 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122178 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.122186 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.123944 4992 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.124396 4992 server.go:1280] "Started kubelet" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.124536 4992 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.125785 4992 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 31 09:25:05 crc systemd[1]: Started Kubernetes Kubelet. 
Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.126343 4992 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.126708 4992 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.131001 4992 server.go:460] "Adding debug handlers to kubelet server" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.132094 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.132545 4992 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.132666 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 18:13:53.398274829 +0000 UTC Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.132958 4992 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.133167 4992 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.135157 4992 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.134817 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.136113 4992 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.134494 4992 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.132806 4992 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.243:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fc6911c20e182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:25:05.12436877 +0000 UTC m=+1.095760747,LastTimestamp:2026-01-31 09:25:05.12436877 +0000 UTC m=+1.095760747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.136340 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="200ms" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.138927 4992 factory.go:55] Registering systemd factory Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.138957 4992 factory.go:221] Registration of the systemd container factory successfully Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.140904 4992 factory.go:153] Registering CRI-O 
factory Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.140925 4992 factory.go:221] Registration of the crio container factory successfully Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.140988 4992 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.141012 4992 factory.go:103] Registering Raw factory Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.141027 4992 manager.go:1196] Started watching for new ooms in manager Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.142657 4992 manager.go:319] Starting recovery of all containers Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150459 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150520 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150538 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150559 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150575 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150590 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150607 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150619 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150639 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150652 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150663 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150682 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150694 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150716 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150729 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150748 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150760 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150772 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150782 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150796 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150808 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150819 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150837 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150852 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150865 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150883 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150900 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150919 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" 
seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150938 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150954 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150966 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.150986 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151003 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151018 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 
09:25:05.151030 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151044 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151061 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151076 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151133 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151149 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151162 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151178 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151192 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151207 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151223 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151238 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151255 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151268 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151282 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151921 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.151967 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152281 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152312 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152329 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152353 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152373 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152388 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152406 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152441 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152456 4992 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152474 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152488 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152537 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152550 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152564 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152583 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152597 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152893 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152913 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152932 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152948 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.152962 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.153321 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155366 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155454 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155475 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155492 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155570 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" 
seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155592 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155637 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155656 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155680 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155697 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155714 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 
09:25:05.155733 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155750 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155797 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155813 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155826 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155839 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155854 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155866 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155877 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155889 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155902 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155934 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155964 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155975 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155987 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.155999 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156010 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156023 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156035 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156048 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156069 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156085 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156142 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156157 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156171 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156185 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156198 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156210 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156224 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156237 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156250 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156264 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156278 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156290 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156304 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156317 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156331 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156345 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156359 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156374 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156386 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156400 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156412 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156441 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156453 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156467 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156480 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156494 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156508 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156521 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156534 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156547 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156559 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156572 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156585 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156602 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156614 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156626 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156637 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156649 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156660 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156671 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156683 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156695 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156706 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156718 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156734 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156747 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156760 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156772 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156784 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156797 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156809 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156822 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156835 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156846 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.156859 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161391 4992 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161447 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161469 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161482 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161494 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161508 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161521 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161533 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161546 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161558 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161574 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161592 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161609 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161625 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161638 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161649 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161659 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161670 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161679 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161689 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161698 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161710 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161719 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161733 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161743 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161754 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" 
seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161764 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161777 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161786 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161797 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161807 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161817 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 
09:25:05.161829 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161841 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161856 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161868 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161887 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161906 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161918 4992 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161935 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161947 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161958 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161969 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161979 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.161996 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.162008 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.162020 4992 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.162031 4992 reconstruct.go:97] "Volume reconstruction finished" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.162038 4992 reconciler.go:26] "Reconciler: start to sync state" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.163067 4992 manager.go:324] Recovery completed Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.173729 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.175809 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.175854 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.175865 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.176530 4992 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.176545 4992 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.176569 4992 state_mem.go:36] "Initialized new in-memory state store" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.179686 4992 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.181268 4992 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.181322 4992 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.181354 4992 kubelet.go:2335] "Starting kubelet main sync loop" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.181396 4992 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.182139 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.182211 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.197379 4992 policy_none.go:49] "None policy: Start" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.198537 4992 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.198589 4992 state_mem.go:35] 
"Initializing new in-memory state store" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.233065 4992 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.254501 4992 manager.go:334] "Starting Device Plugin manager" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.254541 4992 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.254552 4992 server.go:79] "Starting device plugin registration server" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.254868 4992 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.254884 4992 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.255024 4992 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.255089 4992 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.255100 4992 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.262316 4992 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.282089 4992 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:25:05 crc kubenswrapper[4992]: 
I0131 09:25:05.282294 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.283771 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.283804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.283813 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.283935 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.284550 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.284695 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.284979 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.285003 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.285011 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.285352 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.285392 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.285479 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286256 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286306 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286321 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286466 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286498 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286509 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286602 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286628 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286668 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286728 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286547 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.286744 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.287940 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.287961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.287970 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.288187 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.288241 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.288259 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc 
kubenswrapper[4992]: I0131 09:25:05.288641 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.288871 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.288952 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290434 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290475 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290490 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290661 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290705 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290746 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290768 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.290776 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.291535 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.291624 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.291643 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.338280 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="400ms" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.355091 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.356578 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 
09:25:05.356631 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.356648 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.356682 4992 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.357275 4992 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.243:6443: connect: connection refused" node="crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363758 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363830 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363849 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363863 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363882 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363899 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.363964 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364023 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364052 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364077 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364100 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364119 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364140 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364172 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.364198 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465172 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465226 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465247 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465261 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465278 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465295 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465310 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465322 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465337 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 
09:25:05.465351 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465374 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465389 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465403 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465408 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465453 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465454 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465483 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465508 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465512 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465502 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: 
I0131 09:25:05.465525 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465543 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465445 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465513 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465487 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465565 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465583 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465594 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465597 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.465406 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.557937 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.559367 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.559468 4992 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.559483 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.559565 4992 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.560159 4992 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.243:6443: connect: connection refused" node="crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.624798 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.632992 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.661653 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.674104 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.681002 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.690750 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-34eb3943e4bc6152ed928f9a698c275a5eeaa8280bcc36a81b0382ba3eca4fbf WatchSource:0}: Error finding container 34eb3943e4bc6152ed928f9a698c275a5eeaa8280bcc36a81b0382ba3eca4fbf: Status 404 returned error can't find the container with id 34eb3943e4bc6152ed928f9a698c275a5eeaa8280bcc36a81b0382ba3eca4fbf Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.700529 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d59b2fb1c27c08ff3b3622792b6e65c8fb8844943e663dd1109a5f786c5c160e WatchSource:0}: Error finding container d59b2fb1c27c08ff3b3622792b6e65c8fb8844943e663dd1109a5f786c5c160e: Status 404 returned error can't find the container with id d59b2fb1c27c08ff3b3622792b6e65c8fb8844943e663dd1109a5f786c5c160e Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.704516 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d53ece7f9e202c0e20a011c47544ec8d402a4395fed076905282881556c4564b WatchSource:0}: Error finding container d53ece7f9e202c0e20a011c47544ec8d402a4395fed076905282881556c4564b: Status 404 returned error can't find the container with id d53ece7f9e202c0e20a011c47544ec8d402a4395fed076905282881556c4564b Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.706531 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e01eb5cbeec0b7bdfd66391cd207838bd844dbe5b8cc9eb3c80c8eb7831d68c0 
WatchSource:0}: Error finding container e01eb5cbeec0b7bdfd66391cd207838bd844dbe5b8cc9eb3c80c8eb7831d68c0: Status 404 returned error can't find the container with id e01eb5cbeec0b7bdfd66391cd207838bd844dbe5b8cc9eb3c80c8eb7831d68c0 Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.739656 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="800ms" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.961155 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.962400 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.962458 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.962468 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:05 crc kubenswrapper[4992]: I0131 09:25:05.962490 4992 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:05 crc kubenswrapper[4992]: E0131 09:25:05.963015 4992 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.243:6443: connect: connection refused" node="crc" Jan 31 09:25:05 crc kubenswrapper[4992]: W0131 09:25:05.997871 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:05 crc 
kubenswrapper[4992]: E0131 09:25:05.998024 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:06 crc kubenswrapper[4992]: W0131 09:25:06.056709 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:06 crc kubenswrapper[4992]: E0131 09:25:06.056803 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.127755 4992 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.132902 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 04:33:30.375548739 +0000 UTC Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.186054 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d59b2fb1c27c08ff3b3622792b6e65c8fb8844943e663dd1109a5f786c5c160e"} Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.187263 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"26abc425f889064499eda25d5620d70ff18f1006cf16a499f8864130b29079d6"} Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.188623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34eb3943e4bc6152ed928f9a698c275a5eeaa8280bcc36a81b0382ba3eca4fbf"} Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.190140 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e01eb5cbeec0b7bdfd66391cd207838bd844dbe5b8cc9eb3c80c8eb7831d68c0"} Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.191151 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d53ece7f9e202c0e20a011c47544ec8d402a4395fed076905282881556c4564b"} Jan 31 09:25:06 crc kubenswrapper[4992]: W0131 09:25:06.257857 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:06 crc kubenswrapper[4992]: E0131 09:25:06.257940 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:06 crc kubenswrapper[4992]: W0131 09:25:06.424195 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:06 crc kubenswrapper[4992]: E0131 09:25:06.424585 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:06 crc kubenswrapper[4992]: E0131 09:25:06.540304 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="1.6s" Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.764007 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.765811 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.765862 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.765877 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:06 crc kubenswrapper[4992]: I0131 09:25:06.765910 4992 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:06 crc kubenswrapper[4992]: E0131 09:25:06.766479 4992 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.243:6443: connect: connection refused" node="crc" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.127515 4992 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.133772 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:24:58.504093797 +0000 UTC Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.140463 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:25:07 crc kubenswrapper[4992]: E0131 09:25:07.141707 4992 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.199190 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.199347 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.199378 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.199407 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.202182 4992 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8" exitCode=0 Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.202371 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.202362 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.203984 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.204010 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:07 crc kubenswrapper[4992]: 
I0131 09:25:07.204020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.204300 4992 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845" exitCode=0 Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.204455 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.204495 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.206024 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.206050 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.206061 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.206195 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.206953 4992 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1" exitCode=0 Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.206992 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.207058 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.208783 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.208821 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.208835 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.208902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.209173 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.209208 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.211760 4992 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5" exitCode=0 Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.211806 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5"} Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.211875 4992 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.216024 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.216084 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:07 crc kubenswrapper[4992]: I0131 09:25:07.216103 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:07 crc kubenswrapper[4992]: W0131 09:25:07.885056 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:07 crc kubenswrapper[4992]: E0131 09:25:07.885142 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.128084 4992 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.134188 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 10:07:15.380875459 +0000 UTC Jan 31 09:25:08 crc kubenswrapper[4992]: E0131 09:25:08.141533 4992 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="3.2s" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.216456 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7"} Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.216506 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3"} Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.218342 4992 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636" exitCode=0 Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.218388 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636"} Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.218594 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.219781 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590"} Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.219969 
4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.220790 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.220817 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.220825 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.220871 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.220913 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.220929 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.222681 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20"} Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.222704 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537"} Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.222728 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 
09:25:08.223611 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.223658 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.223672 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:08 crc kubenswrapper[4992]: W0131 09:25:08.273526 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:08 crc kubenswrapper[4992]: E0131 09:25:08.273605 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:08 crc kubenswrapper[4992]: W0131 09:25:08.358640 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:08 crc kubenswrapper[4992]: E0131 09:25:08.358720 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.367045 4992 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.368291 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.368344 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.368356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:08 crc kubenswrapper[4992]: I0131 09:25:08.368384 4992 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:08 crc kubenswrapper[4992]: E0131 09:25:08.368858 4992 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.243:6443: connect: connection refused" node="crc" Jan 31 09:25:08 crc kubenswrapper[4992]: W0131 09:25:08.626244 4992 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.243:6443: connect: connection refused Jan 31 09:25:08 crc kubenswrapper[4992]: E0131 09:25:08.626379 4992 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.243:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.127976 4992 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.129.56.243:6443: connect: connection refused Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.135303 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:41:38.821707801 +0000 UTC Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.229283 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665"} Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.229323 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5"} Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.229334 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183"} Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.229386 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.230239 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.230271 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.230284 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.232178 4992 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1" exitCode=0 Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.232242 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1"} Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.232291 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.233125 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.233151 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.233162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.234562 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c"} Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.234585 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.234572 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.235529 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 
09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.235586 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.235602 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.236687 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.236712 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:09 crc kubenswrapper[4992]: I0131 09:25:09.236721 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:09 crc kubenswrapper[4992]: E0131 09:25:09.459969 4992 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.243:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fc6911c20e182 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:25:05.12436877 +0000 UTC m=+1.095760747,LastTimestamp:2026-01-31 09:25:05.12436877 +0000 UTC m=+1.095760747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.136263 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 13:04:55.976088745 +0000 UTC Jan 31 09:25:10 crc 
kubenswrapper[4992]: I0131 09:25:10.240295 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240348 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240665 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596"} Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240701 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620"} Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240717 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832"} Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240729 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760"} Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240788 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.240818 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.241175 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 
09:25:10.241212 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.241227 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.241903 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.241950 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.241988 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.323084 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.323392 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.325378 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.325500 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:10 crc kubenswrapper[4992]: I0131 09:25:10.325530 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.137298 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 16:40:22.717795015 +0000 UTC Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 
09:25:11.156595 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.246794 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76"} Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.246910 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.247818 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.247862 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.247878 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.569874 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.571117 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.571170 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.571185 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.571218 4992 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.939508 4992 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.939652 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.939696 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.940800 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.940826 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:11 crc kubenswrapper[4992]: I0131 09:25:11.940837 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.137536 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:15:02.81762822 +0000 UTC Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.245024 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.245225 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.247001 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.247054 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.247074 4992 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.249796 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.251028 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.251067 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.251083 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.256287 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.256532 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.258102 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.258446 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.258646 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.519579 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.519890 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:12 
crc kubenswrapper[4992]: I0131 09:25:12.521609 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.521681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.521711 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.881939 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.882107 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.883187 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.883215 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:12 crc kubenswrapper[4992]: I0131 09:25:12.883226 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.138167 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:24:30.309178925 +0000 UTC Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.474842 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.474975 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:13 crc kubenswrapper[4992]: 
I0131 09:25:13.475019 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.476094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.476129 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.476141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.871109 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.871271 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.872394 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.872440 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:13 crc kubenswrapper[4992]: I0131 09:25:13.872450 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:14 crc kubenswrapper[4992]: I0131 09:25:14.138967 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 09:15:42.205494388 +0000 UTC Jan 31 09:25:14 crc kubenswrapper[4992]: I0131 09:25:14.213678 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:14 crc kubenswrapper[4992]: 
I0131 09:25:14.255074 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:14 crc kubenswrapper[4992]: I0131 09:25:14.256252 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:14 crc kubenswrapper[4992]: I0131 09:25:14.256312 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:14 crc kubenswrapper[4992]: I0131 09:25:14.256332 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:15 crc kubenswrapper[4992]: I0131 09:25:15.142069 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 20:54:07.813166343 +0000 UTC Jan 31 09:25:15 crc kubenswrapper[4992]: E0131 09:25:15.262520 4992 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:25:15 crc kubenswrapper[4992]: I0131 09:25:15.554927 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 31 09:25:15 crc kubenswrapper[4992]: I0131 09:25:15.555242 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:15 crc kubenswrapper[4992]: I0131 09:25:15.556867 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:15 crc kubenswrapper[4992]: I0131 09:25:15.556918 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:15 crc kubenswrapper[4992]: I0131 09:25:15.556935 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:16 crc kubenswrapper[4992]: I0131 09:25:16.353939 4992 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 01:16:53.603760953 +0000 UTC Jan 31 09:25:16 crc kubenswrapper[4992]: I0131 09:25:16.475898 4992 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:25:16 crc kubenswrapper[4992]: I0131 09:25:16.476489 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:25:17 crc kubenswrapper[4992]: I0131 09:25:17.030492 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 31 09:25:17 crc kubenswrapper[4992]: I0131 09:25:17.031040 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:17 crc kubenswrapper[4992]: I0131 09:25:17.032903 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:17 crc kubenswrapper[4992]: I0131 09:25:17.032958 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:17 crc kubenswrapper[4992]: I0131 09:25:17.032975 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:17 crc kubenswrapper[4992]: I0131 09:25:17.354391 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:00:46.733698143 +0000 UTC Jan 31 09:25:18 crc kubenswrapper[4992]: I0131 09:25:18.354728 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:00:01.838614979 +0000 UTC Jan 31 09:25:19 crc kubenswrapper[4992]: I0131 09:25:19.355559 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 19:54:39.014922854 +0000 UTC Jan 31 09:25:20 crc kubenswrapper[4992]: I0131 09:25:20.028994 4992 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 31 09:25:20 crc kubenswrapper[4992]: I0131 09:25:20.029079 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 09:25:20 crc kubenswrapper[4992]: I0131 09:25:20.033473 4992 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 31 09:25:20 crc kubenswrapper[4992]: I0131 09:25:20.033558 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 09:25:20 crc kubenswrapper[4992]: I0131 09:25:20.356224 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 15:11:32.736162086 +0000 UTC Jan 31 09:25:21 crc kubenswrapper[4992]: I0131 09:25:21.356708 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 14:15:27.582826094 +0000 UTC Jan 31 09:25:22 crc kubenswrapper[4992]: I0131 09:25:22.356832 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:35:16.447782216 +0000 UTC Jan 31 09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.357720 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 08:53:53.161582714 +0000 UTC Jan 31 09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.876108 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.877161 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.878410 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.878508 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.878525 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 
09:25:23 crc kubenswrapper[4992]: I0131 09:25:23.881491 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.217309 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.217504 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.218652 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.218713 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.218733 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.358161 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 11:34:00.724619827 +0000 UTC Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.378341 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.378398 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.379323 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:24 crc kubenswrapper[4992]: I0131 09:25:24.379384 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:24 crc 
kubenswrapper[4992]: I0131 09:25:24.379407 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.026071 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.027093 4992 trace.go:236] Trace[1189087490]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:25:12.618) (total time: 12409ms): Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1189087490]: ---"Objects listed" error: 12409ms (09:25:25.027) Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1189087490]: [12.409044114s] [12.409044114s] END Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.027119 4992 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.028272 4992 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.029564 4992 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.029623 4992 trace.go:236] Trace[1093477170]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:25:14.421) (total time: 10607ms): Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1093477170]: ---"Objects listed" error: 10607ms (09:25:25.029) Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1093477170]: [10.607597862s] [10.607597862s] END Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.029650 4992 reflector.go:368] Caches populated for *v1.Node from 
k8s.io/client-go/informers/factory.go:160 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.031683 4992 trace.go:236] Trace[1375754973]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:25:12.669) (total time: 12361ms): Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1375754973]: ---"Objects listed" error: 12361ms (09:25:25.031) Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1375754973]: [12.361827699s] [12.361827699s] END Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.031714 4992 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.034568 4992 trace.go:236] Trace[1932628140]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:25:12.077) (total time: 12956ms): Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1932628140]: ---"Objects listed" error: 12956ms (09:25:25.034) Jan 31 09:25:25 crc kubenswrapper[4992]: Trace[1932628140]: [12.956900086s] [12.956900086s] END Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.034614 4992 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.035114 4992 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.054844 4992 csr.go:261] certificate signing request csr-9lckn is approved, waiting to be issued Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.058765 4992 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48702->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 09:25:25 crc 
kubenswrapper[4992]: I0131 09:25:25.058824 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48702->192.168.126.11:17697: read: connection reset by peer" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.058997 4992 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48710->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.059036 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48710->192.168.126.11:17697: read: connection reset by peer" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.059487 4992 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.059600 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: 
I0131 09:25:25.063466 4992 csr.go:257] certificate signing request csr-9lckn is issued Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.358373 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:16:36.488914463 +0000 UTC Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.360522 4992 apiserver.go:52] "Watching apiserver" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.363342 4992 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.363540 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.363895 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.363949 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.364079 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.364261 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.364116 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.364088 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.364398 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.364409 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.364634 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.366069 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.366653 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.366871 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.366883 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.366802 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.366740 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.367473 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.367603 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.367879 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.384623 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.386821 4992 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665" exitCode=255 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.386871 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665"} Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.414440 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-pt7xd"] Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.414928 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.417516 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.419246 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.419280 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.423337 4992 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.425484 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.426399 4992 scope.go:117] "RemoveContainer" containerID="a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.437617 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.438211 4992 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.469186 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.481461 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.491658 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.505122 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.521482 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533154 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533219 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533259 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533291 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533347 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533379 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533409 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.533910 4992 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.534164 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.534341 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.534745 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.535186 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.535677 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.535903 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.536396 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.536460 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.536643 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.536753 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.536857 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537035 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537144 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537238 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537322 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537435 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537558 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537802 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537867 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537889 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537916 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537945 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537943 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537966 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537927 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.537988 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538014 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538034 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538019 4992 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538055 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538078 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538099 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538118 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538134 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538155 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538175 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538196 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538214 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538233 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538237 4992 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538253 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538277 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538306 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538334 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538357 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538381 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538386 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538411 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538468 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538502 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538654 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538697 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538728 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538759 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538786 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538810 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538840 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538863 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538888 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" 
(UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538917 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538926 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538940 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.538515 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539054 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539196 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539089 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539397 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540687 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539494 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539775 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540084 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540905 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.539271 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540231 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540389 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.541131 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540970 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.541305 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.541324 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.541334 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.541634 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.540509 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542008 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542061 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.536408 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.541339 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542221 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542322 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542368 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542499 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542395 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542598 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542104 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542660 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542698 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542730 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542754 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542775 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542821 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542883 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542902 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543012 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543077 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543094 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543114 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543155 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543177 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 
09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543194 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543226 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543305 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543325 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543385 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543408 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543456 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543564 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544497 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544534 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544585 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544733 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544769 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544801 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544841 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544867 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544902 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545386 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545803 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545875 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545912 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545956 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 
09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545987 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546014 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546070 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546898 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546981 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.547029 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542669 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542923 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543022 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543283 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543562 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.543594 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.542359 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544098 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544370 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544563 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.544899 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545032 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545216 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545232 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545279 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545450 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.545882 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.549469 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546141 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546309 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546310 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546330 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546671 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546897 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.549823 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550026 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550039 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.547270 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.549968 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.547318 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.546188 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.547735 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.547742 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.548142 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.549381 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.549436 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550353 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550661 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551449 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551532 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551567 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551648 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551681 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551704 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551727 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551751 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551775 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 
31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551795 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551817 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551839 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551860 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551882 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551905 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551927 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551949 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551971 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551991 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552013 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552034 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552056 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552113 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552140 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552904 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552931 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552957 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553026 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553054 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553078 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553100 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553121 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550750 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551232 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551302 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551427 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551638 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550784 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551823 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.551867 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552011 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.550105 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.552305 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553567 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.553757 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.554194 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.554236 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555160 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555226 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555577 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555638 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555691 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555766 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555765 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555886 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555953 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555990 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:25:25 crc 
kubenswrapper[4992]: I0131 09:25:25.556009 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556025 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556054 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556072 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556089 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556108 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556127 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556183 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556202 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556233 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556280 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556296 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556314 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556331 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556354 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556371 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556392 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555953 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.555974 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556123 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556602 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556603 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.556982 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.557231 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.557475 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.557536 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.557567 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.557897 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.557951 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.558158 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.558218 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.558691 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.558851 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.558888 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.558896 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559149 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559206 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559241 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559261 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559280 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559397 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559437 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559401 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559466 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559594 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559719 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559748 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod 
"5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559762 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559793 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559825 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559855 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559882 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:25:25 crc 
kubenswrapper[4992]: I0131 09:25:25.559912 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559945 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559972 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560123 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.559998 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560287 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560326 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560355 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560383 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560645 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560408 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560924 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561010 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561046 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561111 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561145 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561170 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561194 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561219 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561251 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 
09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561278 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561306 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561346 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561455 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561777 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:25 crc 
kubenswrapper[4992]: I0131 09:25:25.561825 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561854 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561881 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561907 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561937 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9caf126d-53ac-498b-97d4-89c3c435805e-hosts-file\") pod 
\"node-resolver-pt7xd\" (UID: \"9caf126d-53ac-498b-97d4-89c3c435805e\") " pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561973 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562007 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562176 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562250 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562315 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562369 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562398 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562441 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562468 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcmb2\" (UniqueName: \"kubernetes.io/projected/9caf126d-53ac-498b-97d4-89c3c435805e-kube-api-access-wcmb2\") pod \"node-resolver-pt7xd\" (UID: \"9caf126d-53ac-498b-97d4-89c3c435805e\") " pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562604 4992 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562624 4992 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562638 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562653 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562668 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562683 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565606 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560811 
4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.560930 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561095 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561187 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561219 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561232 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.561873 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562145 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562455 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562461 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562669 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.562688 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.562734 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:25:26.062690012 +0000 UTC m=+22.034082089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.563072 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.563557 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.563721 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.563738 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.563875 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.564091 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.564367 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.564462 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.564542 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565089 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565208 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565218 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565239 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565246 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565557 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565575 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565884 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.565984 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.566165 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.566962 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.567008 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.566492 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.567007 4992 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.567016 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.567834 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.569392 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.569746 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.569858 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.569917 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:26.069898961 +0000 UTC m=+22.041290948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.570121 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:26.070099067 +0000 UTC m=+22.041491124 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.570475 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.570594 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.570682 4992 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.570755 4992 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.570825 4992 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.570897 4992 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc 
kubenswrapper[4992]: I0131 09:25:25.570973 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571049 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571145 4992 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571238 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571322 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571403 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571667 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571824 4992 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571909 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.571989 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.572059 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.572139 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.572351 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.572589 4992 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.572719 4992 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 
09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.572796 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.573850 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.573953 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574022 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574094 4992 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574168 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574237 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574313 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574384 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.574996 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575091 4992 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575169 4992 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575239 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575309 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575413 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575555 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575628 4992 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575697 4992 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575770 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.575843 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576019 4992 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576108 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc 
kubenswrapper[4992]: I0131 09:25:25.576188 4992 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576282 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576363 4992 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576471 4992 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576563 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576635 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576704 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576851 4992 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576958 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577065 4992 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577172 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577247 4992 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577318 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577391 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577535 4992 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577667 4992 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577888 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.577990 4992 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578101 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578200 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578294 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578376 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node 
\"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578465 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578554 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578627 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578720 4992 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578800 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578875 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.578945 4992 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath 
\"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579012 4992 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579081 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579171 4992 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579250 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579322 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579440 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579604 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579717 4992 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579854 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579946 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580093 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580197 4992 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580333 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580411 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580727 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580756 4992 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580772 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580728 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576820 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.576882 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579676 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580809 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580827 4992 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580840 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580853 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" 
DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580866 4992 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580878 4992 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.579833 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580891 4992 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580924 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580939 4992 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580952 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580963 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580975 4992 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580986 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 
31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.580998 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581012 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581024 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581037 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581050 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581061 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581075 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 
09:25:25.581086 4992 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581099 4992 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581111 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581121 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581141 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581172 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581185 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581195 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581200 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581244 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:26.081227184 +0000 UTC m=+22.052619241 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:25 crc kubenswrapper[4992]: E0131 09:25:25.581266 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:26.081253115 +0000 UTC m=+22.052645182 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581629 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.581810 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.582224 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.583609 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.583658 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.584471 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.584599 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.584661 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.589348 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.589866 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.589869 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.589999 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.590359 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.590669 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.591749 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.591834 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.596814 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.596904 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.598929 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600025 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600188 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600316 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600267 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600202 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600500 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600529 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.600917 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.601096 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.601102 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.601155 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.601292 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.604657 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.604776 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.604974 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.604986 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.605330 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.605583 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.605800 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.608733 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.615248 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.618875 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682281 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682328 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmb2\" (UniqueName: \"kubernetes.io/projected/9caf126d-53ac-498b-97d4-89c3c435805e-kube-api-access-wcmb2\") pod \"node-resolver-pt7xd\" (UID: \"9caf126d-53ac-498b-97d4-89c3c435805e\") " pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682357 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682399 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9caf126d-53ac-498b-97d4-89c3c435805e-hosts-file\") pod \"node-resolver-pt7xd\" (UID: \"9caf126d-53ac-498b-97d4-89c3c435805e\") " pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682449 4992 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682460 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682468 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682476 4992 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682485 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682505 4992 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682512 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682521 4992 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682529 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682536 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682555 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682576 4992 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682584 4992 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682614 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682624 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682631 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682641 4992 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682648 4992 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682679 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682697 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682793 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9caf126d-53ac-498b-97d4-89c3c435805e-hosts-file\") pod \"node-resolver-pt7xd\" (UID: \"9caf126d-53ac-498b-97d4-89c3c435805e\") " pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682848 4992 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682858 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682867 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682874 4992 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682883 4992 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682891 4992 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682899 4992 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682906 4992 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682915 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682923 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682931 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682939 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682947 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682958 4992 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682966 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682975 4992 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.682987 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683000 4992 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683008 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683016 4992 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683023 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683030 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683038 4992 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683047 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683056 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683070 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683078 4992 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" 
Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683085 4992 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683093 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683101 4992 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683109 4992 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683117 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683125 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683133 4992 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683140 4992 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683153 4992 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683162 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683170 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683182 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683191 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683199 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683209 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683217 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683226 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683234 4992 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683245 4992 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683255 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683266 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683277 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683288 4992 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683298 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683309 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683317 4992 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683331 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683341 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683349 4992 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683356 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683365 4992 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.683373 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.688974 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.700262 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmb2\" (UniqueName: \"kubernetes.io/projected/9caf126d-53ac-498b-97d4-89c3c435805e-kube-api-access-wcmb2\") pod \"node-resolver-pt7xd\" (UID: \"9caf126d-53ac-498b-97d4-89c3c435805e\") " pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: W0131 09:25:25.702862 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ad5482b02d03c815c046b62698f9b54c2cd882316d2325c0334ebbf2384b7fe5 WatchSource:0}: Error finding container ad5482b02d03c815c046b62698f9b54c2cd882316d2325c0334ebbf2384b7fe5: Status 404 returned error can't find the container with id ad5482b02d03c815c046b62698f9b54c2cd882316d2325c0334ebbf2384b7fe5 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.704780 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:25:25 crc kubenswrapper[4992]: W0131 09:25:25.715047 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9c9abd34f545b4fb158f37313ec49152ab27215f0c37a72c34e186db815803cd WatchSource:0}: Error finding container 9c9abd34f545b4fb158f37313ec49152ab27215f0c37a72c34e186db815803cd: Status 404 returned error can't find the container with id 9c9abd34f545b4fb158f37313ec49152ab27215f0c37a72c34e186db815803cd Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.726806 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pt7xd" Jan 31 09:25:25 crc kubenswrapper[4992]: W0131 09:25:25.747826 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9caf126d_53ac_498b_97d4_89c3c435805e.slice/crio-bc480e687272f3027c6185bf5831d1d7e3e69c7f2ff59e325d717bed652e0fb1 WatchSource:0}: Error finding container bc480e687272f3027c6185bf5831d1d7e3e69c7f2ff59e325d717bed652e0fb1: Status 404 returned error can't find the container with id bc480e687272f3027c6185bf5831d1d7e3e69c7f2ff59e325d717bed652e0fb1 Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.833713 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.842813 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.843757 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.848031 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.864686 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.877469 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.896014 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.918583 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.931168 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.943133 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.956857 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.970494 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.978571 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.982680 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:25 crc kubenswrapper[4992]: I0131 09:25:25.995681 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.006725 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.021752 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.034719 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.045822 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.062380 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.071485 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-31 09:20:25 +0000 UTC, rotation deadline is 2026-11-13 23:43:29.539478299 +0000 UTC Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.071563 4992 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6878h18m3.46791866s for next certificate rotation Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.083630 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.089929 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.090025 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090104 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:25:27.0900664 +0000 UTC m=+23.061458387 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090154 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090172 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.090180 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.090207 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:26 crc 
kubenswrapper[4992]: I0131 09:25:26.090251 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090186 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090341 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:27.090323797 +0000 UTC m=+23.061715794 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090346 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090220 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090277 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090388 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090397 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090377 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:27.090370929 +0000 UTC m=+23.061762916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090442 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:27.09043519 +0000 UTC m=+23.061827177 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:26 crc kubenswrapper[4992]: E0131 09:25:26.090468 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:27.090457761 +0000 UTC m=+23.061849768 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.359631 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 07:04:49.001354436 +0000 UTC Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.392133 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.392190 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.392204 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ad5482b02d03c815c046b62698f9b54c2cd882316d2325c0334ebbf2384b7fe5"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.394338 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.396043 4992 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.396274 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.397629 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.397661 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"11039d915f46628000c904544bd61378242843248ddc1011902687eaa8127dce"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.399353 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pt7xd" event={"ID":"9caf126d-53ac-498b-97d4-89c3c435805e","Type":"ContainerStarted","Data":"f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.399387 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pt7xd" event={"ID":"9caf126d-53ac-498b-97d4-89c3c435805e","Type":"ContainerStarted","Data":"bc480e687272f3027c6185bf5831d1d7e3e69c7f2ff59e325d717bed652e0fb1"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.400914 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9c9abd34f545b4fb158f37313ec49152ab27215f0c37a72c34e186db815803cd"} Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.407639 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.420733 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.429048 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.437726 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.448565 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.460621 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.474311 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.486029 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.495509 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.505155 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.512668 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.528886 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c
29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.540232 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 
09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.555518 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.566492 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.576338 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.588306 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:26 crc kubenswrapper[4992]: I0131 09:25:26.596710 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.058865 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.072229 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.075393 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 31 09:25:27 crc 
kubenswrapper[4992]: I0131 09:25:27.076297 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.089281 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.097686 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.097744 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.097769 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.097788 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.097805 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097843 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:25:29.097819227 +0000 UTC m=+25.069211214 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097888 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097927 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:29.09791504 +0000 UTC m=+25.069307027 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097930 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097950 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097964 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097974 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097983 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097993 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:25:29.097977712 +0000 UTC m=+25.069369699 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.097994 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.098006 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:29.097999802 +0000 UTC m=+25.069391789 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.098009 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.098034 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:29.098027793 +0000 UTC m=+25.069419780 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.101258 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.112788 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.125121 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.139005 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.154054 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.164724 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.177544 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.181780 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.182041 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.182121 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.182219 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.182452 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.182622 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.185168 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.185820 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.186456 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.187000 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.187564 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.188065 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.188600 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.189081 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.190978 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.191471 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.191972 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.192968 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.193444 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.194306 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.194821 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.196237 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.196333 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.196798 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.197189 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.198202 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.198787 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.199194 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.200284 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.200699 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.201879 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.202256 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.203401 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.204034 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.205191 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.205745 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.206576 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.207015 4992 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.207106 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.209784 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.210744 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.211159 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.212871 4992 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.213844 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.214389 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.215841 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.216494 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.217407 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.217999 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.218973 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.219539 4992 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.220893 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.221053 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323
b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.221459 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.222746 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.223786 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 31 
09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.224512 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.225082 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.225618 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.227032 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.227760 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.228830 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.235037 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.247060 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.259187 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.271403 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.285478 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.304567 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.321655 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 
09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.333811 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.360224 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:00:25.891775097 +0000 UTC Jan 31 09:25:27 crc kubenswrapper[4992]: E0131 09:25:27.411114 4992 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.702617 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-v7wks"] Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.702884 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bjplh"] Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.703084 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.703606 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.704817 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.705454 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.705671 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.705795 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.706197 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.706265 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.706315 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.706712 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.706571 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.707603 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.715971 4992 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.735635 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.751021 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.765408 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.778888 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.796083 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805695 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-system-cni-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805729 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-cni-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805755 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-cnibin\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805773 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-cni-bin\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805848 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5qh5\" (UniqueName: \"kubernetes.io/projected/28d252d5-9d5b-422f-baee-f350df5664b6-kube-api-access-x5qh5\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805886 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-conf-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805918 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/28d252d5-9d5b-422f-baee-f350df5664b6-rootfs\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805952 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-socket-dir-parent\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805967 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-daemon-config\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.805984 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28d252d5-9d5b-422f-baee-f350df5664b6-mcd-auth-proxy-config\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806085 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-multus-certs\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806141 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28d252d5-9d5b-422f-baee-f350df5664b6-proxy-tls\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806178 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6bd42532-8655-4c14-991b-4cc36dea52d5-cni-binary-copy\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 
09:25:27.806200 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-kubelet\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806226 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-os-release\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806254 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-cni-multus\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806277 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-etc-kubernetes\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806296 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfcv\" (UniqueName: \"kubernetes.io/projected/6bd42532-8655-4c14-991b-4cc36dea52d5-kube-api-access-7bfcv\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806316 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-k8s-cni-cncf-io\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806345 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-netns\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.806386 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-hostroot\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.822386 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.836078 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.875362 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907039 4992 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-k8s-cni-cncf-io\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907072 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-netns\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907097 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-hostroot\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907111 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-cnibin\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907125 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-cni-bin\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907147 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-system-cni-dir\") pod \"multus-bjplh\" (UID: 
\"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907162 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-cni-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907178 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-conf-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907182 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-k8s-cni-cncf-io\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907194 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/28d252d5-9d5b-422f-baee-f350df5664b6-rootfs\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907243 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-netns\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907253 
4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5qh5\" (UniqueName: \"kubernetes.io/projected/28d252d5-9d5b-422f-baee-f350df5664b6-kube-api-access-x5qh5\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907278 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-hostroot\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907290 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-socket-dir-parent\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907312 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-cnibin\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907312 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-daemon-config\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907336 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/28d252d5-9d5b-422f-baee-f350df5664b6-mcd-auth-proxy-config\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907354 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-multus-certs\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907383 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28d252d5-9d5b-422f-baee-f350df5664b6-proxy-tls\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907405 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6bd42532-8655-4c14-991b-4cc36dea52d5-cni-binary-copy\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907456 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-kubelet\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907476 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-os-release\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907495 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-cni-multus\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907516 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-etc-kubernetes\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907534 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bfcv\" (UniqueName: \"kubernetes.io/projected/6bd42532-8655-4c14-991b-4cc36dea52d5-kube-api-access-7bfcv\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907227 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/28d252d5-9d5b-422f-baee-f350df5664b6-rootfs\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907653 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-run-multus-certs\") pod \"multus-bjplh\" (UID: 
\"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907706 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-system-cni-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907707 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-socket-dir-parent\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907739 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-cni-bin\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907756 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-kubelet\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.907774 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-host-var-lib-cni-multus\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908021 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-cni-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908051 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-os-release\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908074 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-conf-dir\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908093 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6bd42532-8655-4c14-991b-4cc36dea52d5-etc-kubernetes\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908582 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/28d252d5-9d5b-422f-baee-f350df5664b6-mcd-auth-proxy-config\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908617 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/6bd42532-8655-4c14-991b-4cc36dea52d5-cni-binary-copy\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.908592 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6bd42532-8655-4c14-991b-4cc36dea52d5-multus-daemon-config\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.912059 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28d252d5-9d5b-422f-baee-f350df5664b6-proxy-tls\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.914760 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.930973 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5qh5\" (UniqueName: \"kubernetes.io/projected/28d252d5-9d5b-422f-baee-f350df5664b6-kube-api-access-x5qh5\") pod \"machine-config-daemon-v7wks\" (UID: \"28d252d5-9d5b-422f-baee-f350df5664b6\") " pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.949923 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7bfcv\" (UniqueName: \"kubernetes.io/projected/6bd42532-8655-4c14-991b-4cc36dea52d5-kube-api-access-7bfcv\") pod \"multus-bjplh\" (UID: \"6bd42532-8655-4c14-991b-4cc36dea52d5\") " pod="openshift-multus/multus-bjplh" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.952484 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri
-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.983988 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:27 crc kubenswrapper[4992]: I0131 09:25:27.994993 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.005009 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.016326 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bjplh" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.020784 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.024591 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:25:28 crc kubenswrapper[4992]: W0131 09:25:28.034387 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d252d5_9d5b_422f_baee_f350df5664b6.slice/crio-6b416a68ed8dfcbc26e7a5856b524a9126f421db8f9f2dbb886184f2dd1ca92f WatchSource:0}: Error finding container 6b416a68ed8dfcbc26e7a5856b524a9126f421db8f9f2dbb886184f2dd1ca92f: Status 404 returned error can't find the container with id 6b416a68ed8dfcbc26e7a5856b524a9126f421db8f9f2dbb886184f2dd1ca92f Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.038266 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.056873 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.072819 4992 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.094860 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.110592 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.111290 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9s7nb"] Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.111992 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.112578 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-46cdx"] Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.113262 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.114792 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.114887 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.117004 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.117217 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.118435 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.118738 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.118864 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.119017 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.119076 4992 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.126748 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.142403 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.156894 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.174278 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.188071 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.202515 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212188 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cni-binary-copy\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212244 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-ovn-kubernetes\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212270 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-bin\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212297 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6939ca32-c541-41c0-ba96-4282b942ff16-ovn-node-metrics-cert\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212319 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsg2\" (UniqueName: \"kubernetes.io/projected/6939ca32-c541-41c0-ba96-4282b942ff16-kube-api-access-2dsg2\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212347 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-netd\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212400 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-config\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 
09:25:28.212469 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-systemd\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212486 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-node-log\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212501 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-log-socket\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212521 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212578 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-slash\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc 
kubenswrapper[4992]: I0131 09:25:28.212593 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212609 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-env-overrides\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212628 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8c2m\" (UniqueName: \"kubernetes.io/projected/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-kube-api-access-h8c2m\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212649 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-var-lib-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212665 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-script-lib\") pod \"ovnkube-node-46cdx\" (UID: 
\"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212685 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212706 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-netns\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212723 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-etc-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212738 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-ovn\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212774 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-os-release\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212817 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-kubelet\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212912 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212943 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-systemd-units\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212964 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-system-cni-dir\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.212985 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cnibin\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.221090 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.236049 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.251826 4992 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.268642 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.290448 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.304596 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314124 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 
09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314160 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8c2m\" (UniqueName: \"kubernetes.io/projected/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-kube-api-access-h8c2m\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314175 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-slash\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314193 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314211 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-env-overrides\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314227 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-var-lib-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: 
I0131 09:25:28.314242 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-script-lib\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314257 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314282 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-os-release\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314295 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-netns\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314310 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-etc-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: 
I0131 09:25:28.314325 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-ovn\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314346 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-kubelet\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314369 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314384 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-systemd-units\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314412 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cnibin\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314445 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-system-cni-dir\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314461 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cni-binary-copy\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314480 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-ovn-kubernetes\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314501 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-bin\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314517 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6939ca32-c541-41c0-ba96-4282b942ff16-ovn-node-metrics-cert\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314532 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2dsg2\" (UniqueName: \"kubernetes.io/projected/6939ca32-c541-41c0-ba96-4282b942ff16-kube-api-access-2dsg2\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314548 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-netd\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314561 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-systemd\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314579 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-node-log\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314594 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-log-socket\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314607 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-config\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314633 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-var-lib-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314686 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314722 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cnibin\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314752 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314843 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-env-overrides\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314893 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-ovn\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314915 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-netns\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314923 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-os-release\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314935 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-etc-openvswitch\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314301 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-slash\") pod \"ovnkube-node-46cdx\" (UID: 
\"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314965 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-kubelet\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315000 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315003 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-systemd-units\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.314993 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315051 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-systemd\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315084 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-config\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315121 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-node-log\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315144 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-log-socket\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315167 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-system-cni-dir\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315189 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-ovn-kubernetes\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: 
I0131 09:25:28.315211 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-netd\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315215 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-script-lib\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315209 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315233 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-bin\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.315549 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.319965 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6939ca32-c541-41c0-ba96-4282b942ff16-ovn-node-metrics-cert\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.330085 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8c2m\" (UniqueName: \"kubernetes.io/projected/fa856ff8-dbc2-46d7-9df9-eb4320bd69a6-kube-api-access-h8c2m\") pod \"multus-additional-cni-plugins-9s7nb\" (UID: \"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\") " pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.331893 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsg2\" (UniqueName: \"kubernetes.io/projected/6939ca32-c541-41c0-ba96-4282b942ff16-kube-api-access-2dsg2\") pod \"ovnkube-node-46cdx\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.334017 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.345283 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.355830 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.360742 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 09:16:00.104052198 +0000 UTC Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.368992 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.383870 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.407009 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7"} Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.409363 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6"} Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.409452 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0"} Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.409490 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"6b416a68ed8dfcbc26e7a5856b524a9126f421db8f9f2dbb886184f2dd1ca92f"} Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.410543 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerStarted","Data":"29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57"} Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.410600 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerStarted","Data":"ffe5300c99cede9c2ac513901886d90add38cb9e8ae355ea09af6d31babf1a83"} Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.424523 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.425672 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.430677 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.444638 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.458626 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.477301 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.508975 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.532697 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.574203 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.614401 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.650806 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.691695 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.729632 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.781637 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.812006 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.848758 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.889908 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.930446 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d
66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:28 crc kubenswrapper[4992]: I0131 09:25:28.972147 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.010967 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.051229 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.094501 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.121997 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.122128 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122143 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:25:33.122121467 +0000 UTC m=+29.093513464 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.122170 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.122200 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122230 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122243 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:29 crc 
kubenswrapper[4992]: E0131 09:25:29.122255 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122278 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122294 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:33.122283341 +0000 UTC m=+29.093675328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122307 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.122230 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122310 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:33.122302062 +0000 UTC m=+29.093694049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122363 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:33.122357123 +0000 UTC m=+29.093749110 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122456 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122488 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122500 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.122556 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:33.122541438 +0000 UTC m=+29.093933425 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.132478 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.170797 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.182441 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.182448 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.182616 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.182471 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.182841 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:29 crc kubenswrapper[4992]: E0131 09:25:29.183018 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.215135 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.260573 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.291925 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.331471 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.364348 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 06:50:31.799236547 +0000 UTC Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.380582 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.413399 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.414588 4992 generic.go:334] "Generic (PLEG): container finished" podID="fa856ff8-dbc2-46d7-9df9-eb4320bd69a6" containerID="5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0" exitCode=0 Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.414627 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" 
event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerDied","Data":"5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0"} Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.414663 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerStarted","Data":"72945698b4484ca96a9c39d655c647d901f5f0b5967c4774a23b9d975adc8b8f"} Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.415838 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" exitCode=0 Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.415949 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.415989 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"706a6d5527482a210ea1d90d8ea86c4f87a85094d0cdbc69b975098a24c405d9"} Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.450942 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.495741 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.529898 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.571549 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.611018 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.651984 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.699631 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.730076 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.769856 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.818589 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.853321 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.892694 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.930413 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.930755 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9jjrt"] Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.931680 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.962266 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 09:25:29 crc kubenswrapper[4992]: I0131 09:25:29.982802 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.002680 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.039532 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48594b98-9b83-4c95-80a5-5655ce93a260-host\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.039827 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccftx\" (UniqueName: 
\"kubernetes.io/projected/48594b98-9b83-4c95-80a5-5655ce93a260-kube-api-access-ccftx\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.040165 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48594b98-9b83-4c95-80a5-5655ce93a260-serviceca\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.040835 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.063281 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.098557 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.134122 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.141596 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48594b98-9b83-4c95-80a5-5655ce93a260-host\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.141680 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccftx\" (UniqueName: \"kubernetes.io/projected/48594b98-9b83-4c95-80a5-5655ce93a260-kube-api-access-ccftx\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.141719 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/48594b98-9b83-4c95-80a5-5655ce93a260-serviceca\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.141712 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48594b98-9b83-4c95-80a5-5655ce93a260-host\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.142920 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/48594b98-9b83-4c95-80a5-5655ce93a260-serviceca\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.193578 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccftx\" (UniqueName: \"kubernetes.io/projected/48594b98-9b83-4c95-80a5-5655ce93a260-kube-api-access-ccftx\") pod \"node-ca-9jjrt\" (UID: \"48594b98-9b83-4c95-80a5-5655ce93a260\") " pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.236189 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.243838 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9jjrt" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.277407 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: W0131 09:25:30.278162 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48594b98_9b83_4c95_80a5_5655ce93a260.slice/crio-ca9de870d557e1f692f8234f94dc340dbd5d285aa9c9a6c00c4adb7720286adf WatchSource:0}: Error finding container ca9de870d557e1f692f8234f94dc340dbd5d285aa9c9a6c00c4adb7720286adf: Status 404 
returned error can't find the container with id ca9de870d557e1f692f8234f94dc340dbd5d285aa9c9a6c00c4adb7720286adf Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.298251 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.325818 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.353198 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.364907 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 03:28:25.117262167 +0000 UTC Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.396918 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.429134 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerStarted","Data":"d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080"} Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.431573 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.431598 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" 
event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.431613 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.431623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.432835 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9jjrt" event={"ID":"48594b98-9b83-4c95-80a5-5655ce93a260","Type":"ContainerStarted","Data":"ca9de870d557e1f692f8234f94dc340dbd5d285aa9c9a6c00c4adb7720286adf"} Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.438395 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.470567 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.509110 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.550905 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.591763 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.634028 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.672239 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.713566 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.757397 4992 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e
8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.791531 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.830587 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.877652 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.912740 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.956152 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:30 crc kubenswrapper[4992]: I0131 09:25:30.994528 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.035223 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.073095 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.112068 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.152912 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.182224 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.182448 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.183024 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.183161 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.183370 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.183497 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.191041 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.231953 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.274554 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 
09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.365053 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 04:47:07.873420713 +0000 UTC Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.428741 4992 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.430690 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.430739 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.430752 4992 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.430858 4992 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.436556 4992 generic.go:334] "Generic (PLEG): container finished" podID="fa856ff8-dbc2-46d7-9df9-eb4320bd69a6" containerID="d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080" exitCode=0 Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.436636 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerDied","Data":"d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.440135 4992 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.440564 4992 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442267 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442308 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442320 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442339 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442810 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" 
event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442846 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.442353 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.444337 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9jjrt" event={"ID":"48594b98-9b83-4c95-80a5-5655ce93a260","Type":"ContainerStarted","Data":"6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.461791 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.464800 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.470008 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.470044 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.470053 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.470068 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.470078 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.481542 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.482815 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.488021 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.488075 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.488086 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.488104 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.488115 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.494396 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.507448 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.511581 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.511623 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.511637 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.511657 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.511668 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.517036 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.527300 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.529265 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.533471 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.533508 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.533516 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.533532 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.533544 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.551607 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: E0131 09:25:31.551738 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.554621 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.554868 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.554908 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.554918 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.554933 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.554942 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.590201 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.632268 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.656629 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc 
kubenswrapper[4992]: I0131 09:25:31.656664 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.656675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.656690 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.657144 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.673078 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.712935 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.752302 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.759755 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.759787 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.759795 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.759807 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.759817 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.799778 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.832723 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.861807 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 
09:25:31.861857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.861866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.861880 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.861889 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.871446 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.915365 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.956672 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.964183 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.964236 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.964253 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.964275 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.964290 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:31Z","lastTransitionTime":"2026-01-31T09:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:31 crc kubenswrapper[4992]: I0131 09:25:31.995090 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.035028 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.066807 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.066870 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.066885 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.066905 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.066920 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.073696 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.112341 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.151130 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.169674 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.169709 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.169720 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.169737 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.169747 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.191351 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.237239 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.273258 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.273310 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.273327 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.273350 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.273366 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.273970 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.309942 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.348855 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.365537 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 17:08:34.846937755 +0000 UTC Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.375218 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.375271 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.375286 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.375309 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.375322 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.393393 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.436452 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 
09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.451209 4992 generic.go:334] "Generic (PLEG): container finished" podID="fa856ff8-dbc2-46d7-9df9-eb4320bd69a6" containerID="a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952" exitCode=0 Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.451311 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerDied","Data":"a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.477705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.477747 4992 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.477769 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.477785 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.477796 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.480343 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.512959 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.553311 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.579797 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.579831 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.579840 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.579857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.579867 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.589986 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.630538 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.673781 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.687011 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.687105 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.687120 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.687137 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.687148 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.713545 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.751066 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.789199 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.789246 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.789257 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.789272 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.789282 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.793534 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:
25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.838488 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.870769 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.892278 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 
09:25:32.892337 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.892348 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.892371 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.892386 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.913783 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.953827 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 
09:25:32.999282 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.999340 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.999365 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.999392 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:32 crc kubenswrapper[4992]: I0131 09:25:32.999406 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:32Z","lastTransitionTime":"2026-01-31T09:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.000832 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.032742 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.075696 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.102157 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.102198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.102208 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.102253 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.102269 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.118357 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.172795 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173062 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:25:41.172984766 +0000 UTC m=+37.144376753 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.173116 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.173177 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.173228 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.173273 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173461 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173520 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173552 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173575 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173589 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173537 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:41.173522481 +0000 UTC m=+37.144914538 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173634 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:41.173608134 +0000 UTC m=+37.145000201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173681 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:41.173664145 +0000 UTC m=+37.145056272 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173760 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173777 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173788 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.173827 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:41.173816789 +0000 UTC m=+37.145208866 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.182558 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.182572 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.182800 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.182880 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.182677 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:33 crc kubenswrapper[4992]: E0131 09:25:33.182981 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.204702 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.204738 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.204749 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.204764 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.204776 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.307024 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.307068 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.307081 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.307098 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.307109 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.366577 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 18:28:49.411050226 +0000 UTC Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.409347 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.409385 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.409396 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.409411 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.409442 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.456533 4992 generic.go:334] "Generic (PLEG): container finished" podID="fa856ff8-dbc2-46d7-9df9-eb4320bd69a6" containerID="3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331" exitCode=0 Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.456660 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerDied","Data":"3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.462106 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.477459 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.492800 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.508349 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.515097 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.515133 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.515142 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.515157 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.515166 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.531435 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.545757 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.560530 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.580309 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.603115 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.617408 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.617489 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.617503 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.617525 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.617552 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.618883 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.633066 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.646219 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.664851 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.679875 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.694884 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.712359 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.720223 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc 
kubenswrapper[4992]: I0131 09:25:33.720264 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.720275 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.720291 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.720300 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.822587 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.822621 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.822631 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.822645 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.822655 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.927268 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.927625 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.927638 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.927660 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:33 crc kubenswrapper[4992]: I0131 09:25:33.927681 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:33Z","lastTransitionTime":"2026-01-31T09:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.029769 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.029808 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.029819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.029834 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.029845 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.131852 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.131883 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.131891 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.131906 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.131916 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.233928 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.233961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.233969 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.233981 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.233990 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.336660 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.336739 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.336758 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.337163 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.337222 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.367232 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 14:27:24.782033474 +0000 UTC Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.439766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.439822 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.439831 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.439851 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.439863 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.476567 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerStarted","Data":"891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.495384 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name
\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.511854 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.523920 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.541189 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.542173 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.542318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.542482 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.542603 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.542701 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.563935 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.582401 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.597261 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.616564 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.633757 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.644845 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.644875 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.644884 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.644898 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.644907 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.648104 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.662777 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c
2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.675455 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.692193 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.706484 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.729363 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.747496 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.747535 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.747557 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.747576 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.747605 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.849821 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.849862 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.849873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.849890 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.849902 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.952199 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.952649 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.952663 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.952684 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.952697 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:34Z","lastTransitionTime":"2026-01-31T09:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:34 crc kubenswrapper[4992]: I0131 09:25:34.953071 4992 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.055558 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.055719 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.055919 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.056045 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.056155 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.159694 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.159750 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.159764 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.159786 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.159799 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.182668 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:35 crc kubenswrapper[4992]: E0131 09:25:35.182798 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.182866 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:35 crc kubenswrapper[4992]: E0131 09:25:35.182941 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.182668 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:35 crc kubenswrapper[4992]: E0131 09:25:35.183012 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.203216 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-3
1T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.214256 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.228041 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.245946 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.261376 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.262956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.262980 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc 
kubenswrapper[4992]: I0131 09:25:35.262989 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.263001 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.263028 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.283433 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.296195 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.325377 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.340237 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.354099 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.366138 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.366181 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.366193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc 
kubenswrapper[4992]: I0131 09:25:35.366211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.366223 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.368254 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:26:20.961754914 +0000 UTC Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.368572 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.382077 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.397725 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 
09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.412737 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.427519 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.469141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.469210 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.469222 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.469259 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.469274 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.485328 4992 generic.go:334] "Generic (PLEG): container finished" podID="fa856ff8-dbc2-46d7-9df9-eb4320bd69a6" containerID="891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8" exitCode=0 Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.485390 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerDied","Data":"891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.494363 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.494774 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.495000 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.511239 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 
09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.528090 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.542562 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.556873 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.572003 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.572916 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 
09:25:35.572962 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.572973 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.572992 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.573003 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.582622 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.597063 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.614835 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.626071 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.634540 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.651991 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.662057 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.665026 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.665733 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.673146 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.675922 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.675953 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.675964 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.676001 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.676016 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.683180 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.692221 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.702246 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.717409 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.731184 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.749014 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.768792 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.779139 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.779182 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.779198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.779218 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.779232 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.787475 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806b
b67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.804075 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.836004 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1
e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.879821 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.881912 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.881987 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.882006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.882052 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.882069 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.912637 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.952915 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.984408 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.984484 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.984497 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.984517 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:35 crc kubenswrapper[4992]: I0131 09:25:35.984532 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:35Z","lastTransitionTime":"2026-01-31T09:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.001492 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.031930 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.077983 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.087190 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.087251 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.087271 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.087295 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.087312 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.112811 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.190724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.190763 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.190777 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.190792 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.190803 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.294062 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.294124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.294152 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.294180 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.294201 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.369389 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:23:39.855354877 +0000 UTC Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.396549 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.396596 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.396606 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.396623 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.396634 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.499130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.499209 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.499233 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.499261 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.499282 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.502123 4992 generic.go:334] "Generic (PLEG): container finished" podID="fa856ff8-dbc2-46d7-9df9-eb4320bd69a6" containerID="1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8" exitCode=0 Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.502190 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerDied","Data":"1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.502300 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.526396 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4
f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.537661 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.548362 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.565535 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.583365 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.602013 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.602044 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.602052 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.602065 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.601922 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.602073 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.615506 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.644791 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.661720 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.676834 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.693073 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1
e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.704286 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.704328 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.704340 4992 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.704381 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.704394 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.707058 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.723125 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.734107 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.756858 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.807158 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.807199 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.807212 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.807230 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.807242 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.909675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.909724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.909781 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.909804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:36 crc kubenswrapper[4992]: I0131 09:25:36.909821 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:36Z","lastTransitionTime":"2026-01-31T09:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.012120 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.012172 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.012182 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.012198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.012222 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.114871 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.114934 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.114960 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.114986 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.115003 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.182061 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.182148 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.182186 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:37 crc kubenswrapper[4992]: E0131 09:25:37.182226 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:37 crc kubenswrapper[4992]: E0131 09:25:37.182333 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:37 crc kubenswrapper[4992]: E0131 09:25:37.182549 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.218005 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.218072 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.218096 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.218127 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.218148 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.323214 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.323278 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.323300 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.323330 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.323350 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.370381 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:24:49.136258926 +0000 UTC Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.426378 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.426429 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.426438 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.426454 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.426465 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.506946 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" event={"ID":"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6","Type":"ContainerStarted","Data":"63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.507008 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.522065 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.528347 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.528391 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 
09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.528402 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.528438 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.528449 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.537927 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.550557 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.564986 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.585140 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.597985 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.617166 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.633332 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.635454 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.635567 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.635629 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.635700 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.635756 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.647672 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.662005 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.674100 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.697077 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.708881 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.723949 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.735622 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.738844 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.738886 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.738895 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.738910 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.738919 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.841304 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.841341 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.841352 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.841369 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.841381 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.943797 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.943843 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.943851 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.943866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:37 crc kubenswrapper[4992]: I0131 09:25:37.943875 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:37Z","lastTransitionTime":"2026-01-31T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.046814 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.046867 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.046880 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.046899 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.046912 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.151094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.151386 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.151398 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.151436 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.151448 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.255403 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.255461 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.255473 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.255492 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.255503 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.359358 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.359473 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.359491 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.359517 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.359536 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.370858 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 19:04:29.717549552 +0000 UTC Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.462535 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.462575 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.462587 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.462608 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.462624 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.513912 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/0.log" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.517749 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1" exitCode=1 Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.517839 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.519264 4992 scope.go:117] "RemoveContainer" containerID="654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.538102 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.557938 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.564948 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.564984 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.564993 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.565008 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.565018 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.576545 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.600402 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.626245 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.646896 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.667371 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc 
kubenswrapper[4992]: I0131 09:25:38.667701 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.667799 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.667898 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.667981 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.673809 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.692504 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.711762 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.738565 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57
a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.755990 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.768588 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.770642 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.770689 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.770709 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.770736 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.770759 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.812563 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] 
Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.833143 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.849317 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:38Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.872989 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.873026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.873034 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.873048 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.873057 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.975072 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.975112 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.975124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.975141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:38 crc kubenswrapper[4992]: I0131 09:25:38.975152 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:38Z","lastTransitionTime":"2026-01-31T09:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.077656 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.077697 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.077711 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.077727 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.077740 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.180439 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.180490 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.180503 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.180522 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.180535 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.182000 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.182022 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.182052 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:39 crc kubenswrapper[4992]: E0131 09:25:39.182136 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:39 crc kubenswrapper[4992]: E0131 09:25:39.182218 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:39 crc kubenswrapper[4992]: E0131 09:25:39.182336 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.283577 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.283655 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.283671 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.283695 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.283715 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.371917 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 06:49:51.222517517 +0000 UTC Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.386516 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.386585 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.386605 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.386633 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.386653 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.490154 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.490193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.490205 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.490221 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.490235 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.524751 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/0.log" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.528074 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.528277 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.543067 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276
703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.560519 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.572959 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.587871 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.592905 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.592968 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.592985 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.593009 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.593027 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.621146 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts"] Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.622022 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.624076 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\"
:\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.625255 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.626683 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.650366 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.672565 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.689718 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.696326 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.696372 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.696380 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.696395 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.696404 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.709101 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.733625 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.752686 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d127d3-476a-4068-a55a-919fcd4b187d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.752794 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d127d3-476a-4068-a55a-919fcd4b187d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.752986 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v92lv\" (UniqueName: \"kubernetes.io/projected/b0d127d3-476a-4068-a55a-919fcd4b187d-kube-api-access-v92lv\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.753173 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d127d3-476a-4068-a55a-919fcd4b187d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: 
\"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.759508 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58
408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.784744 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.794968 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.799406 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.799580 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.799705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.799841 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.799977 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.816700 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.831321 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.844722 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.854258 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d127d3-476a-4068-a55a-919fcd4b187d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.854336 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d127d3-476a-4068-a55a-919fcd4b187d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.854390 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v92lv\" (UniqueName: \"kubernetes.io/projected/b0d127d3-476a-4068-a55a-919fcd4b187d-kube-api-access-v92lv\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.854473 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0d127d3-476a-4068-a55a-919fcd4b187d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.855508 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/b0d127d3-476a-4068-a55a-919fcd4b187d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.855699 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0d127d3-476a-4068-a55a-919fcd4b187d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.859286 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.862609 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0d127d3-476a-4068-a55a-919fcd4b187d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.877078 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.885024 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v92lv\" (UniqueName: \"kubernetes.io/projected/b0d127d3-476a-4068-a55a-919fcd4b187d-kube-api-access-v92lv\") pod \"ovnkube-control-plane-749d76644c-k66ts\" (UID: \"b0d127d3-476a-4068-a55a-919fcd4b187d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.891802 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.903765 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.903824 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.903838 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.903856 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.903869 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:39Z","lastTransitionTime":"2026-01-31T09:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.908370 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.923537 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.938872 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.941144 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.960129 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:39 crc kubenswrapper[4992]: W0131 09:25:39.960234 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d127d3_476a_4068_a55a_919fcd4b187d.slice/crio-7648e320d76ce6b5d6e08c93dfbcdcfebcdf9e52744d1936a58afccc7f9e9abe WatchSource:0}: Error finding container 7648e320d76ce6b5d6e08c93dfbcdcfebcdf9e52744d1936a58afccc7f9e9abe: Status 404 returned error can't find the container with id 7648e320d76ce6b5d6e08c93dfbcdcfebcdf9e52744d1936a58afccc7f9e9abe Jan 31 09:25:39 crc kubenswrapper[4992]: I0131 09:25:39.979853 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.000786 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.006766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.006814 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.006826 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.006844 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.006857 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.021735 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.036669 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.055231 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.075347 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.089896 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.109437 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.109491 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.109502 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.109522 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.109572 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.119199 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.211876 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.212218 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.212231 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.212248 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.212259 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.315630 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.315686 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.315703 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.315727 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.315748 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.372497 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:36:19.961105519 +0000 UTC Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.418982 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.419032 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.419041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.419055 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.419064 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.522305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.522359 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.522373 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.522390 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.522403 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.533990 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/1.log" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.534962 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/0.log" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.537594 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa" exitCode=1 Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.537714 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.537801 4992 scope.go:117] "RemoveContainer" containerID="654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.539162 4992 scope.go:117] "RemoveContainer" containerID="f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa" Jan 31 09:25:40 crc kubenswrapper[4992]: E0131 09:25:40.539481 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.543961 4992 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" event={"ID":"b0d127d3-476a-4068-a55a-919fcd4b187d","Type":"ContainerStarted","Data":"632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.544688 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" event={"ID":"b0d127d3-476a-4068-a55a-919fcd4b187d","Type":"ContainerStarted","Data":"e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.544739 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" event={"ID":"b0d127d3-476a-4068-a55a-919fcd4b187d","Type":"ContainerStarted","Data":"7648e320d76ce6b5d6e08c93dfbcdcfebcdf9e52744d1936a58afccc7f9e9abe"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.555314 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.571253 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.584933 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T0
9:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.603025 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.621466 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.626101 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.626171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.626191 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.626217 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.626232 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.638093 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:
25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.652248 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.669046 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.687874 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.703230 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.724786 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e9
53d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.729108 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.729162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.729181 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.729208 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.729226 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.744131 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.760506 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.781094 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.795103 4992 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.810757 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.823539 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.831195 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.831238 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.831250 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.831267 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.831279 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.839028 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.852236 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.868761 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.902723 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.927903 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.934908 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.934968 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.934985 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.935011 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.935027 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:40Z","lastTransitionTime":"2026-01-31T09:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.945945 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.963597 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.976675 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:40 crc kubenswrapper[4992]: I0131 09:25:40.992125 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.011965 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.022568 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.042187 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.042454 4992 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.042521 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.042538 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.042563 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.042579 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.054306 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.067576 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.079267 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.144686 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bplq6"] Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.145699 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.145773 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.146193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.146238 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.146250 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.146269 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.146282 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.161393 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc 
kubenswrapper[4992]: I0131 09:25:41.175872 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.182545 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.183259 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.183569 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.183731 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.183929 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.184037 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.191287 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea
83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.202775 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25
:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.218509 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.235930 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.248915 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.249000 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.249017 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.249073 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.249092 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.252539 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.267149 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270413 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270545 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270613 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:25:57.270583691 +0000 UTC m=+53.241975678 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270663 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270668 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270691 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270704 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270748 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270774 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270796 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.270817 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9nkh\" (UniqueName: \"kubernetes.io/projected/afb1d129-e6bb-4db2-8204-3a1f4d91048e-kube-api-access-c9nkh\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270912 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not 
registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270942 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:57.270935621 +0000 UTC m=+53.242327598 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.270961 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:57.270955231 +0000 UTC m=+53.242347218 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.271009 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.271022 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.271032 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.271053 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:57.271046064 +0000 UTC m=+53.242438051 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.271075 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.271092 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:25:57.271087245 +0000 UTC m=+53.242479232 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.284280 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.309156 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.325636 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.345796 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.351953 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.351997 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.352007 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.352024 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.352036 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.371808 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.372084 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.372192 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9nkh\" (UniqueName: \"kubernetes.io/projected/afb1d129-e6bb-4db2-8204-3a1f4d91048e-kube-api-access-c9nkh\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.372380 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.372501 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. 
No retries permitted until 2026-01-31 09:25:41.872469906 +0000 UTC m=+37.843861893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.372647 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 16:08:26.707981368 +0000 UTC Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.391775 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.401304 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9nkh\" (UniqueName: \"kubernetes.io/projected/afb1d129-e6bb-4db2-8204-3a1f4d91048e-kube-api-access-c9nkh\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.407593 4992 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.422312 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.445203 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.455494 4992 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.455560 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.455579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.455623 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.455648 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.551898 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/1.log" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.557700 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.557740 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.557749 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.557765 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.557776 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.659877 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.660003 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.660026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.660044 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.660057 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.661582 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.661730 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.661813 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.661897 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.661966 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.679526 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.685242 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.685354 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.685441 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.685554 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.685612 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.700720 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... same status payload as above ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.706412 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.706598 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.706658 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.706751 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.706819 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.720444 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... same status payload as above ...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.726242 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.726289 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.726300 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.726317 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.726329 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.743332 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.748699 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.748751 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.748767 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.748789 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.748806 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.763903 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.764157 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.766340 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.766380 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.766397 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.766441 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.766457 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.872678 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.872752 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.872774 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.873368 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.873401 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.878801 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.879019 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: E0131 09:25:41.879102 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:25:42.87908052 +0000 UTC m=+38.850472507 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.976671 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.977171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.977255 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.977340 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:41 crc kubenswrapper[4992]: I0131 09:25:41.977461 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:41Z","lastTransitionTime":"2026-01-31T09:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.080309 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.080360 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.080382 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.080404 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.080449 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.182712 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.182778 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.182796 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.182819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.182837 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.286143 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.286201 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.286214 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.286233 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.286248 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.372935 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 11:56:00.486737434 +0000 UTC Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.390018 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.390080 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.390095 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.390118 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.390133 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.493814 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.493883 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.493898 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.493916 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.493926 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.608567 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.608636 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.608646 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.608665 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.608676 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.711621 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.711677 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.711686 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.711705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.711716 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.814707 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.814766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.814779 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.814800 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.814816 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.889110 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.898596 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:42 crc kubenswrapper[4992]: E0131 09:25:42.898801 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:42 crc kubenswrapper[4992]: E0131 09:25:42.898886 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:25:44.89886286 +0000 UTC m=+40.870254847 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.906102 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.920334 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.926269 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.926338 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.926353 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.926378 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.926482 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:42Z","lastTransitionTime":"2026-01-31T09:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.935074 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:42 crc kubenswrapper[4992]: I0131 09:25:42.989345 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:25:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.006113 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.022962 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09
:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.029095 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.029161 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.029171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.029192 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.029202 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.039905 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806b
b67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.064788 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.080131 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.093441 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.114062 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.129204 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.132637 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.132724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.132739 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.132763 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.132804 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.144077 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.165711 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.183011 4992 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.183011 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.183021 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.183161 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:43 crc kubenswrapper[4992]: E0131 09:25:43.183265 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:43 crc kubenswrapper[4992]: E0131 09:25:43.183397 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:43 crc kubenswrapper[4992]: E0131 09:25:43.183485 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:43 crc kubenswrapper[4992]: E0131 09:25:43.183628 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.185097 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.201989 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.216498 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.236125 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.236181 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.236192 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.236208 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.236225 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.338987 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.339040 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.339052 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.339071 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.339082 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.373548 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:32:05.955653522 +0000 UTC Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.441688 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.441766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.441781 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.441810 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.441825 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.546240 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.546314 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.546336 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.546368 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.546390 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.650087 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.650159 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.650181 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.650209 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.650228 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.753076 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.753149 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.753162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.753183 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.753197 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.855584 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.855629 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.855641 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.855659 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.855671 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.958545 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.958621 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.958641 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.958666 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:43 crc kubenswrapper[4992]: I0131 09:25:43.958686 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:43Z","lastTransitionTime":"2026-01-31T09:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.061675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.061747 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.061765 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.061790 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.061809 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.164359 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.164692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.164765 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.164799 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.164813 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.268449 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.268501 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.268531 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.268556 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.268570 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.371294 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.371333 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.371369 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.371388 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.371399 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.374680 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 10:41:41.271157683 +0000 UTC Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.475233 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.475277 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.475300 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.475322 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.475338 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.578902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.578961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.578974 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.578989 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.579001 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.681818 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.681886 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.681899 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.681920 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.681934 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.784868 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.784924 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.784943 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.784967 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.784985 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.888969 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.889049 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.889074 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.889106 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.889131 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.926214 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:44 crc kubenswrapper[4992]: E0131 09:25:44.926526 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:44 crc kubenswrapper[4992]: E0131 09:25:44.926662 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:25:48.926631444 +0000 UTC m=+44.898023511 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.992622 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.992689 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.992705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.992729 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:44 crc kubenswrapper[4992]: I0131 09:25:44.992744 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:44Z","lastTransitionTime":"2026-01-31T09:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.095614 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.095695 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.095719 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.095750 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.095773 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.182309 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.182415 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:45 crc kubenswrapper[4992]: E0131 09:25:45.182560 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.182651 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.182676 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:45 crc kubenswrapper[4992]: E0131 09:25:45.182769 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:45 crc kubenswrapper[4992]: E0131 09:25:45.182937 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:45 crc kubenswrapper[4992]: E0131 09:25:45.183088 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.198277 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.198316 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.198327 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.198341 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.198352 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.206253 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.223945 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.238268 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.252002 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.263215 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.281024 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.294874 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.301356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.301445 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.301470 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.301501 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.301561 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.317471 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.360809 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.374955 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 11:55:42.452921903 +0000 UTC Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.375372 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.402764 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.405225 4992 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.405264 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.405285 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.405321 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.405339 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.422190 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.441266 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.456597 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.470646 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.484747 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc 
kubenswrapper[4992]: I0131 09:25:45.501720 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:45Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.508950 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.509013 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.509031 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.509058 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.509076 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.612570 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.612631 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.612650 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.612678 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.612696 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.716351 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.716470 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.716506 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.716532 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.716553 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.819506 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.819611 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.819629 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.819661 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.819680 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.922926 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.922994 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.923030 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.923071 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:45 crc kubenswrapper[4992]: I0131 09:25:45.923095 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:45Z","lastTransitionTime":"2026-01-31T09:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.026403 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.026491 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.026533 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.026557 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.026569 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.128985 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.129035 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.129044 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.129070 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.129081 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.231752 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.231816 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.231833 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.231858 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.231877 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.335185 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.335242 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.335263 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.335283 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.335295 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.375756 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:25:37.62961855 +0000 UTC Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.438606 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.438667 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.438692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.438717 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.438734 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.542762 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.542832 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.542849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.542889 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.542907 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.646026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.646113 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.646137 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.646169 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.646196 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.749097 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.749154 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.749172 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.749197 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.749215 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.852512 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.852580 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.852600 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.852627 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.852646 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.956111 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.956175 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.956193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.956217 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:46 crc kubenswrapper[4992]: I0131 09:25:46.956238 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:46Z","lastTransitionTime":"2026-01-31T09:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.063613 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.063702 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.063721 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.063757 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.063776 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.167337 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.167515 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.167593 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.167626 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.167719 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.181658 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.181721 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:47 crc kubenswrapper[4992]: E0131 09:25:47.181806 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.181658 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.181868 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:47 crc kubenswrapper[4992]: E0131 09:25:47.182065 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:47 crc kubenswrapper[4992]: E0131 09:25:47.182146 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:47 crc kubenswrapper[4992]: E0131 09:25:47.182238 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.270959 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.271030 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.271051 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.271076 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.271094 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.374588 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.374653 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.374675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.374702 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.374722 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.376791 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 22:10:59.58503904 +0000 UTC Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.479577 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.479653 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.479675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.479706 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.479731 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.582318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.582365 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.582382 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.582406 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.582454 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.686060 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.686116 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.686128 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.686149 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.686163 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.789259 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.789323 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.789355 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.789384 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.789407 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.892972 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.893022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.893041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.893064 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.893084 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.996278 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.996308 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.996319 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.996333 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:47 crc kubenswrapper[4992]: I0131 09:25:47.996343 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:47Z","lastTransitionTime":"2026-01-31T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.099073 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.099144 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.099167 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.099198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.099218 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.202454 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.202505 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.202519 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.202545 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.202557 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.305874 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.305948 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.305962 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.305981 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.305993 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.377544 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 20:33:22.920245848 +0000 UTC Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.409957 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.410006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.410020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.410042 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.410238 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.513520 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.513807 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.513815 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.513830 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.513840 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.616325 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.616348 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.616356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.616369 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.616378 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.719379 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.719486 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.719503 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.719528 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.719544 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.822464 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.822535 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.822561 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.822591 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.822625 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.927042 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.927113 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.927134 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.927162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.927191 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:48Z","lastTransitionTime":"2026-01-31T09:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:48 crc kubenswrapper[4992]: I0131 09:25:48.971507 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:48 crc kubenswrapper[4992]: E0131 09:25:48.971812 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:48 crc kubenswrapper[4992]: E0131 09:25:48.971974 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:25:56.97194065 +0000 UTC m=+52.943332677 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.030171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.030230 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.030243 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.030261 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.030275 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.134079 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.134522 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.134619 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.134714 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.134819 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.181917 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.182074 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:49 crc kubenswrapper[4992]: E0131 09:25:49.182495 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.182302 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.182158 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:49 crc kubenswrapper[4992]: E0131 09:25:49.182933 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:49 crc kubenswrapper[4992]: E0131 09:25:49.182970 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:49 crc kubenswrapper[4992]: E0131 09:25:49.182637 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.238176 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.238254 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.238277 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.238326 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.238360 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.341348 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.341408 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.341463 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.341490 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.341506 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.378403 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 01:15:49.749157416 +0000 UTC Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.444958 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.445006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.445016 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.445034 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.445046 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.548537 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.548617 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.548634 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.548656 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.548689 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.652211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.652270 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.652287 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.652310 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.652331 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.755607 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.755643 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.755653 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.755668 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.755678 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.858885 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.858955 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.858975 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.859001 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.859018 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.961716 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.961850 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.961865 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.961882 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:49 crc kubenswrapper[4992]: I0131 09:25:49.961896 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:49Z","lastTransitionTime":"2026-01-31T09:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.064956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.065003 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.065014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.065031 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.065043 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.168097 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.168159 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.168180 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.168208 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.168230 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.270859 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.270899 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.270913 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.270930 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.270943 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.374130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.374176 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.374191 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.374211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.374225 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.379376 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 12:17:27.304066136 +0000 UTC Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.477163 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.477236 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.477287 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.477318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.477341 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.580791 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.580858 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.580882 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.580913 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.580936 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.685149 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.685197 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.685209 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.685227 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.685241 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.787852 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.788201 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.788232 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.788260 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.788284 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.891062 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.891354 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.891453 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.891538 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.891665 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.995776 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.995826 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.995837 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.995856 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:50 crc kubenswrapper[4992]: I0131 09:25:50.995870 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:50Z","lastTransitionTime":"2026-01-31T09:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.098273 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.098350 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.098372 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.098406 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.098452 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.181707 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.181741 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.181792 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.181982 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.182034 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.182170 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.182322 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.182517 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.200969 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.201032 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.201050 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.201075 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.201094 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.304094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.304164 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.304201 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.304225 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.304242 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.379618 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 10:05:41.090714282 +0000 UTC Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.407688 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.407737 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.407749 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.407768 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.407780 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.510587 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.510654 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.510675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.510703 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.510721 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.613157 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.613203 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.613216 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.613232 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.613244 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.716273 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.716318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.716331 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.716350 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.716364 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.819885 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.820222 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.820302 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.820397 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.820528 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.869809 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.869933 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.869958 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.869991 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.870014 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.886991 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.894143 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.894533 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.894692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.894804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.894894 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.913528 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.918782 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.919080 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.919174 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.919727 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.919832 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.939571 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.944236 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.944289 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.944308 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.944332 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.944350 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.959502 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.963934 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.964036 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.964054 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.964083 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.964101 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.978364 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:51 crc kubenswrapper[4992]: E0131 09:25:51.978588 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.981131 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.981174 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.981187 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.981211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:51 crc kubenswrapper[4992]: I0131 09:25:51.981224 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:51Z","lastTransitionTime":"2026-01-31T09:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.084872 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.085235 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.085362 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.085526 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.085646 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.189058 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.189568 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.189745 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.189973 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.190162 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.293294 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.293352 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.293369 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.293394 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.293413 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.380680 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 04:53:13.723666238 +0000 UTC Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.396151 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.396233 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.396249 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.396342 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.396363 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.503749 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.503826 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.503838 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.503861 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.503877 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.527360 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.546493 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.553095 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.568732 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.583822 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.600001 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc 
kubenswrapper[4992]: I0131 09:25:52.606971 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.607016 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.607028 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.607051 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.607065 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.617109 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806b
b67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.635534 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.656136 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.673108 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.697043 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-
31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.710054 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.710106 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.710124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.710148 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.710167 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.715584 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.732719 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.759398 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.774603 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.788713 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.806905 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.813193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.813231 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.813239 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.813254 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.813264 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.822889 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.853722 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654201f5bac73f208e3e5843ddaca2a548cd25755f900350efe1754d6d2a20b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:38Z\\\",\\\"message\\\":\\\"lector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034358 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:25:38.034403 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:25:38.034496 6281 reflector.go:311] Stopping reflector 
*v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034565 6281 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034575 6281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:25:38.034560 6281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034644 6281 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:25:38.034713 6281 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0131 09:25:38.035197 6281 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run 
ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"
/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.916071 4992 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.916133 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.916152 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.916178 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:52 crc kubenswrapper[4992]: I0131 09:25:52.916195 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:52Z","lastTransitionTime":"2026-01-31T09:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.018979 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.019043 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.019060 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.019085 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.019104 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.122795 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.122839 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.122852 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.122871 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.122884 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.182553 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.182553 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.182747 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:53 crc kubenswrapper[4992]: E0131 09:25:53.182900 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.182920 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:53 crc kubenswrapper[4992]: E0131 09:25:53.183049 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:53 crc kubenswrapper[4992]: E0131 09:25:53.183559 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:53 crc kubenswrapper[4992]: E0131 09:25:53.184176 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.185627 4992 scope.go:117] "RemoveContainer" containerID="f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.202974 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.219681 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.225212 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.225407 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.225577 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.225751 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.225914 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.233310 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.264139 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for 
network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.281513 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.298236 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.314656 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.329320 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.329367 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.329379 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.329401 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.329444 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.333263 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc 
kubenswrapper[4992]: I0131 09:25:53.352406 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.371744 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.381712 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 14:48:18.518296545 +0000 UTC Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.395985 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.430730 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.432391 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.432411 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.432441 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.432460 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.432472 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.447715 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z 
is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.473693 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01
-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.488877 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.501658 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.517918 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.533840 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.534549 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.534604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.534638 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.534663 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.534679 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.604021 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/1.log" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.619189 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.619380 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.629836 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.637806 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.637870 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.637884 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.637902 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.637913 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.656570 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for 
network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} 
name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.674657 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.694018 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.708775 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.727011 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.741170 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.741234 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.741247 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.741268 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.741281 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.743035 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc 
kubenswrapper[4992]: I0131 09:25:53.764002 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.787621 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.805541 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.814268 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.838121 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.844364 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.844446 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.844464 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.844487 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.844501 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.856768 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806b
b67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.869367 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.882378 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.897343 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.923880 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.946274 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.947482 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.947529 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.947545 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.947567 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.947581 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:53Z","lastTransitionTime":"2026-01-31T09:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:53 crc kubenswrapper[4992]: I0131 09:25:53.966196 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.050836 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.050886 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.050900 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.050920 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.050934 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.155035 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.155079 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.155088 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.155103 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.155113 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.258758 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.258838 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.258864 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.258897 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.258922 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.361970 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.362376 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.362397 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.362452 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.362513 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.382262 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:00:50.678887045 +0000 UTC Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.465767 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.465856 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.465884 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.465914 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.465934 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.570037 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.570168 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.570280 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.570323 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.570353 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.626985 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/2.log" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.628139 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/1.log" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.632322 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156" exitCode=1 Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.632463 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.632547 4992 scope.go:117] "RemoveContainer" containerID="f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.633825 4992 scope.go:117] "RemoveContainer" containerID="72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156" Jan 31 09:25:54 crc kubenswrapper[4992]: E0131 09:25:54.634177 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.667273 4992 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c1
2b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31
a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.673839 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.673913 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.673933 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.673963 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.673984 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.688607 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.709208 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.733750 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.752123 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.771089 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.778022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.778531 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.778794 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.779184 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.779531 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.792062 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.809903 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.842831 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for 
network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped 
ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069
d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.862294 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.881328 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.884355 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 
09:25:54.884619 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.884669 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.884700 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.885034 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.898388 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.919593 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc 
kubenswrapper[4992]: I0131 09:25:54.943043 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.963035 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.985514 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:54Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.988436 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.988482 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.988496 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.988520 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:54 crc kubenswrapper[4992]: I0131 09:25:54.988538 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:54Z","lastTransitionTime":"2026-01-31T09:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.008412 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.033402 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09
:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.091706 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.091774 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.091795 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.091821 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.091838 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.182315 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.182464 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:55 crc kubenswrapper[4992]: E0131 09:25:55.182625 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.182672 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.182678 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:55 crc kubenswrapper[4992]: E0131 09:25:55.182818 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:55 crc kubenswrapper[4992]: E0131 09:25:55.183014 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:55 crc kubenswrapper[4992]: E0131 09:25:55.183170 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.194939 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.194996 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.195011 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.195031 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.195047 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.207048 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.226561 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.243080 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.259914 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.280856 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.297372 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.298124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.298163 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.298198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.298220 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.298235 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.316511 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.336269 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.352227 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.378535 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-
31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.382478 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:55:02.544514626 +0000 UTC Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.398732 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.401606 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 
09:25:55.401656 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.401693 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.401719 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.401736 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.421959 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.443139 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.459536 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.481267 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.497158 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.504629 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.504677 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.504723 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.504746 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.504760 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.510549 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.538385 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f99e66bbfe6a69b6a73b02a6de3adc7d7834336385f065b0594d55bbb6969baa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:39.367859 6446 services_controller.go:454] Service openshift-route-controller-manager/route-controller-manager for 
network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0131 09:25:39.367876 6446 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:39Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:25:39.367851 6446 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Se\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped 
ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069
d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.607722 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.607787 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.607803 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.607897 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.607954 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.639002 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/2.log" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.643111 4992 scope.go:117] "RemoveContainer" containerID="72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156" Jan 31 09:25:55 crc kubenswrapper[4992]: E0131 09:25:55.643358 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.663832 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.686854 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.703995 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.711492 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.711545 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.711566 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.711593 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.711612 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.738945 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.755923 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.768842 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.796296 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.811510 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.817042 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.817073 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.817083 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 
09:25:55.817100 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.817112 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.827664 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.840341 4992 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.853266 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.867005 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc 
kubenswrapper[4992]: I0131 09:25:55.880734 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.897660 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.914289 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.919809 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.919857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.919867 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.919883 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.919896 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:55Z","lastTransitionTime":"2026-01-31T09:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.934145 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z 
is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.952521 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f90
13d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:55 crc kubenswrapper[4992]: I0131 09:25:55.968920 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:25:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.023761 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.023849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.023868 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.023907 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.023932 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.127275 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.127343 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.127362 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.127388 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.127409 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.230488 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.230523 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.230532 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.230547 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.230558 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.333498 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.333584 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.333608 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.333632 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.333647 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.383561 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:45:06.470337838 +0000 UTC Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.436843 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.436917 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.436937 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.436965 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.436985 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.540274 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.540375 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.540395 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.540432 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.540447 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.644094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.644162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.644185 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.644212 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.644234 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.747831 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.747887 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.747900 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.747924 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.747937 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.851031 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.851083 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.851096 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.851114 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.851126 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.954386 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.954459 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.954469 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.954488 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:56 crc kubenswrapper[4992]: I0131 09:25:56.954499 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:56Z","lastTransitionTime":"2026-01-31T09:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.057115 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.057162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.057174 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.057193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.057205 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.065045 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.065252 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.065336 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:26:13.065312288 +0000 UTC m=+69.036704275 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.159945 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.160002 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.160014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.160050 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.160064 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.182749 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.182956 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.183034 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.183043 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.182749 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.183168 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.183303 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.183391 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.263891 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.263934 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.263945 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.263961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.263973 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.367882 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.367981 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368019 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368045 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368125 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368159 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368161 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: 
I0131 09:25:57.368184 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368213 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.368201 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368411 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368527 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:26:29.368508654 +0000 UTC m=+85.339900641 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368534 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368624 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368677 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368677 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368723 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368637 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:26:29.368627597 +0000 UTC m=+85.340019594 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368699 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368753 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368757 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:26:29.368747511 +0000 UTC m=+85.340139498 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368846 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:26:29.368814792 +0000 UTC m=+85.340206809 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:57 crc kubenswrapper[4992]: E0131 09:25:57.368882 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:26:29.368866344 +0000 UTC m=+85.340258361 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.383900 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 00:58:30.338747814 +0000 UTC Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.471902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.471955 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.471966 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.471986 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.472001 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.575934 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.576008 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.576026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.576058 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.576082 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.679762 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.679821 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.679829 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.679849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.679860 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.782041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.782136 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.782160 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.782188 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.782207 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.885795 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.885866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.885883 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.885909 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.885927 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.993461 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.993543 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.993558 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.993579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:57 crc kubenswrapper[4992]: I0131 09:25:57.993629 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:57Z","lastTransitionTime":"2026-01-31T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.098155 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.098219 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.098238 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.098262 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.098280 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.201769 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.201838 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.201857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.201883 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.201902 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.305413 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.305493 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.305503 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.305522 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.305532 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.385059 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 18:21:57.927405701 +0000 UTC Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.408134 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.408583 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.408839 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.409006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.409115 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.512518 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.512571 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.512585 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.512604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.512619 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.615073 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.615122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.615137 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.615171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.615186 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.718672 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.718751 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.718773 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.718806 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.718828 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.822445 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.822509 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.822525 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.822551 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.822603 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.925510 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.925564 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.925575 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.925594 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:58 crc kubenswrapper[4992]: I0131 09:25:58.925609 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:58Z","lastTransitionTime":"2026-01-31T09:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.028263 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.028329 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.028344 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.028368 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.028385 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.131949 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.132023 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.132048 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.132083 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.132108 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.181900 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.181994 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.182035 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.181900 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:25:59 crc kubenswrapper[4992]: E0131 09:25:59.182207 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:25:59 crc kubenswrapper[4992]: E0131 09:25:59.182308 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:25:59 crc kubenswrapper[4992]: E0131 09:25:59.182384 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:25:59 crc kubenswrapper[4992]: E0131 09:25:59.182519 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.235733 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.235831 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.235863 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.235900 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.235929 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.339508 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.339610 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.339623 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.339644 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.339658 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.385992 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 09:35:37.640722453 +0000 UTC Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.441754 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.441819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.441841 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.441862 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.441875 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.544888 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.544927 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.544936 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.544952 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.544967 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.648098 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.648149 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.648163 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.648186 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.648198 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.751659 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.751734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.751776 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.751817 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.751844 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.855231 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.855284 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.855302 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.855328 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.855346 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.958474 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.958548 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.958571 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.958597 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:25:59 crc kubenswrapper[4992]: I0131 09:25:59.958615 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:25:59Z","lastTransitionTime":"2026-01-31T09:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.062297 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.062345 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.062356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.062373 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.062398 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.164889 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.164927 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.164937 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.164952 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.164964 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.266842 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.266902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.266928 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.266949 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.266962 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.369580 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.369679 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.369696 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.369720 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.369733 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.387085 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:13:56.228883189 +0000 UTC Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.472364 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.472410 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.472452 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.472476 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.472490 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.579134 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.579169 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.579180 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.579198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.579213 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.682239 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.682284 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.682295 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.682312 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.682325 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.785230 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.785267 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.785275 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.785292 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.785302 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.888535 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.888588 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.888604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.888626 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.888643 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.991338 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.991393 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.991408 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.991447 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:00 crc kubenswrapper[4992]: I0131 09:26:00.991465 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:00Z","lastTransitionTime":"2026-01-31T09:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.095281 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.095328 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.095339 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.095355 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.095365 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.182277 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.182461 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.182480 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.182586 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:01 crc kubenswrapper[4992]: E0131 09:26:01.182831 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:01 crc kubenswrapper[4992]: E0131 09:26:01.182953 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:01 crc kubenswrapper[4992]: E0131 09:26:01.182726 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:01 crc kubenswrapper[4992]: E0131 09:26:01.183155 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.198836 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.198933 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.198956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.198986 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.199006 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.301850 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.301899 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.301911 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.301930 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.301944 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.388079 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 01:06:15.980579605 +0000 UTC Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.403895 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.403932 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.403944 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.403961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.403972 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.507513 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.507550 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.507561 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.507577 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.507588 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.611511 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.611548 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.611557 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.611574 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.611585 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.714462 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.714518 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.714531 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.714549 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.714560 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.818054 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.818109 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.818121 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.818141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.818155 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.921612 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.921658 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.921671 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.921697 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:01 crc kubenswrapper[4992]: I0131 09:26:01.921713 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:01Z","lastTransitionTime":"2026-01-31T09:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.024961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.025023 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.025038 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.025060 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.025075 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.026725 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.026804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.026816 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.026840 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.026855 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: E0131 09:26:02.046245 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.050385 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.050443 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.050458 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.050479 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.050497 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: E0131 09:26:02.064940 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.069734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.069812 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.069837 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.069869 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.069892 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: E0131 09:26:02.087137 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.091704 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.091753 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.091771 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.091796 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.091816 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: E0131 09:26:02.108642 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.113166 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.113224 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.113237 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.113255 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.113270 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: E0131 09:26:02.131244 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:02 crc kubenswrapper[4992]: E0131 09:26:02.131516 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.133666 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.133722 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.133734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.133751 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.133762 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.236766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.236825 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.236837 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.236857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.236873 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.340208 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.340268 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.340285 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.340313 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.340330 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.388474 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 22:44:28.489092551 +0000 UTC Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.443352 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.443428 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.443439 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.443460 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.443491 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.547095 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.547139 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.547147 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.547165 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.547184 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.650789 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.650832 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.650842 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.650863 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.650874 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.754616 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.754663 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.754675 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.754693 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.754704 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.857475 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.857538 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.857557 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.857582 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.857600 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.960527 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.960582 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.960595 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.960613 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:02 crc kubenswrapper[4992]: I0131 09:26:02.960627 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:02Z","lastTransitionTime":"2026-01-31T09:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.063728 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.063801 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.063812 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.063832 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.063863 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.167180 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.167230 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.167242 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.167263 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.167276 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.182163 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.182229 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.182288 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:03 crc kubenswrapper[4992]: E0131 09:26:03.182385 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.182409 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:03 crc kubenswrapper[4992]: E0131 09:26:03.182697 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:03 crc kubenswrapper[4992]: E0131 09:26:03.182726 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:03 crc kubenswrapper[4992]: E0131 09:26:03.182912 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.270845 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.270909 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.270922 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.270947 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.270962 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.374495 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.374548 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.374560 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.374580 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.374593 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.389464 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 06:23:38.256589364 +0000 UTC Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.488748 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.488827 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.488844 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.488873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.488891 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.592802 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.592870 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.592884 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.592904 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.592918 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.695960 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.696014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.696026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.696043 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.696055 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.799679 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.799727 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.799739 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.799757 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.799769 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.902338 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.902374 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.902383 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.902397 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:03 crc kubenswrapper[4992]: I0131 09:26:03.902406 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:03Z","lastTransitionTime":"2026-01-31T09:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.005020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.005081 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.005095 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.005114 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.005129 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.108999 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.109354 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.109460 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.109562 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.109660 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.212132 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.212192 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.212204 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.212221 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.212231 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.315352 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.315450 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.315471 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.315496 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.315513 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.390032 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 16:29:04.69831519 +0000 UTC Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.418788 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.418843 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.418860 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.418884 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.418902 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.521284 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.521342 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.521362 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.521385 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.521402 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.624860 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.624944 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.624963 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.624996 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.625016 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.728633 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.728693 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.728711 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.728734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.728753 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.832557 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.832611 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.832627 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.832648 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.832668 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.935923 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.935990 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.936007 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.936032 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:04 crc kubenswrapper[4992]: I0131 09:26:04.936050 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:04Z","lastTransitionTime":"2026-01-31T09:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.039485 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.039575 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.039601 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.039634 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.039658 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.141867 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.141922 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.141940 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.141961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.141977 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.181858 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.181971 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.182032 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:05 crc kubenswrapper[4992]: E0131 09:26:05.182173 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.182197 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:05 crc kubenswrapper[4992]: E0131 09:26:05.182394 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:05 crc kubenswrapper[4992]: E0131 09:26:05.182568 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:05 crc kubenswrapper[4992]: E0131 09:26:05.182669 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.202867 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.224551 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d0430184
5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.238915 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.245147 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 
09:26:05.245398 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.245589 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.245766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.245918 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.255389 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.276707 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.297749 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.321822 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.334809 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.348609 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.348652 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.348663 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.348681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.348717 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.363305 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.376744 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.391057 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 04:23:18.862862966 +0000 UTC Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.393177 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.409158 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.422723 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.441542 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.451356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.451410 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.451449 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.451507 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.451521 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.458332 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806b
b67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.476642 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.495452 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.516359 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:05Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.553930 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.553985 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.554001 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.554021 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.554036 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.658292 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.658769 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.658889 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.659037 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.659149 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.761964 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.762330 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.762508 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.762613 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.762739 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.866469 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.866515 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.866526 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.866546 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.866559 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.970508 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.971239 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.971299 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.971330 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:05 crc kubenswrapper[4992]: I0131 09:26:05.971348 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:05Z","lastTransitionTime":"2026-01-31T09:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.105811 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.105853 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.105875 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.105894 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.105906 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.208067 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.208103 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.208125 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.208142 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.208154 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.310842 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.310901 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.310914 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.310938 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.310952 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.391380 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 07:57:00.407581979 +0000 UTC Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.414260 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.414295 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.414303 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.414318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.414328 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.516993 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.517070 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.517097 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.517148 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.517172 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.620544 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.620648 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.620672 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.620705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.620732 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.723979 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.724028 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.724064 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.724087 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.724101 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.826604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.826641 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.826650 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.826666 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.826676 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.935513 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.935618 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.935642 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.935673 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:06 crc kubenswrapper[4992]: I0131 09:26:06.935693 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:06Z","lastTransitionTime":"2026-01-31T09:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.038953 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.039031 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.039051 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.039081 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.039101 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.143068 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.143130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.143148 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.143172 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.143191 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.182536 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.182656 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:07 crc kubenswrapper[4992]: E0131 09:26:07.182729 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.182766 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.182666 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:07 crc kubenswrapper[4992]: E0131 09:26:07.182923 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:07 crc kubenswrapper[4992]: E0131 09:26:07.183166 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:07 crc kubenswrapper[4992]: E0131 09:26:07.183201 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.246214 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.246263 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.246275 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.246296 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.246309 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.349363 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.349414 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.349451 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.349472 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.349484 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.391807 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 12:48:56.057980253 +0000 UTC Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.453023 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.453070 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.453082 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.453102 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.453115 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.556169 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.556211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.556222 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.556238 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.556249 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.659298 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.659344 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.659354 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.659370 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.659381 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.762998 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.763059 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.763071 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.763091 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.763104 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.865822 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.865891 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.865910 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.865938 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.865958 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.969171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.969228 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.969240 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.969260 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:07 crc kubenswrapper[4992]: I0131 09:26:07.969275 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:07Z","lastTransitionTime":"2026-01-31T09:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.071690 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.071747 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.071762 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.071781 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.071795 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.175141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.175188 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.175198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.175221 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.175235 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.282681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.282730 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.282740 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.282757 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.282767 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.385475 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.385552 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.385570 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.385592 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.385604 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.393004 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:15:23.728266699 +0000 UTC Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.488812 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.488869 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.488882 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.488905 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.488918 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.592260 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.592315 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.592329 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.592347 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.592363 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.694886 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.694948 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.694967 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.694993 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.695011 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.799037 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.799102 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.799122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.799143 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.799158 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.906247 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.906305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.906320 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.906341 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:08 crc kubenswrapper[4992]: I0131 09:26:08.906356 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:08Z","lastTransitionTime":"2026-01-31T09:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.009903 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.009949 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.009960 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.009976 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.009987 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.113535 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.113599 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.113618 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.113647 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.113667 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.181771 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.181830 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.181889 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.182254 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:09 crc kubenswrapper[4992]: E0131 09:26:09.182238 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:09 crc kubenswrapper[4992]: E0131 09:26:09.182494 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:09 crc kubenswrapper[4992]: E0131 09:26:09.182601 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:09 crc kubenswrapper[4992]: E0131 09:26:09.182647 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.216539 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.216599 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.216616 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.216640 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.216659 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.319721 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.319780 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.319798 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.319822 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.319839 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.393678 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:22:00.369031924 +0000 UTC Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.422547 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.422584 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.422596 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.422614 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.422627 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.526087 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.526155 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.526173 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.526198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.526216 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.629278 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.629332 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.629341 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.629358 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.629368 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.732784 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.732849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.732868 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.732899 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.732917 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.835165 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.835206 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.835217 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.835235 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.835246 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.938105 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.938159 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.938175 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.938197 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:09 crc kubenswrapper[4992]: I0131 09:26:09.938213 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:09Z","lastTransitionTime":"2026-01-31T09:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.041719 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.041785 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.041804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.041825 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.041841 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.144969 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.145017 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.145030 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.145049 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.145063 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.183881 4992 scope.go:117] "RemoveContainer" containerID="72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156" Jan 31 09:26:10 crc kubenswrapper[4992]: E0131 09:26:10.184255 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.247627 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.247676 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.247692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.247709 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.247721 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.350339 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.350370 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.350380 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.350393 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.350403 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.393987 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 23:28:55.146603817 +0000 UTC Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.453485 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.453528 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.453537 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.453555 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.453565 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.556298 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.556349 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.556361 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.556380 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.556393 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.659777 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.659832 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.659845 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.659866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.659881 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.762147 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.762185 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.762198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.762216 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.762225 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.864647 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.864702 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.864714 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.864735 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.864747 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.966957 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.967009 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.967022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.967046 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:10 crc kubenswrapper[4992]: I0131 09:26:10.967063 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:10Z","lastTransitionTime":"2026-01-31T09:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.069436 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.069478 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.069489 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.069505 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.069516 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.171985 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.172026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.172035 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.172051 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.172061 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.182121 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:11 crc kubenswrapper[4992]: E0131 09:26:11.182246 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.182456 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:11 crc kubenswrapper[4992]: E0131 09:26:11.182530 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.182672 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:11 crc kubenswrapper[4992]: E0131 09:26:11.182744 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.182957 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:11 crc kubenswrapper[4992]: E0131 09:26:11.183033 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.282855 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.282909 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.282920 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.282941 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.282955 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.385229 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.385275 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.385287 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.385305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.385319 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.394624 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:08:02.61992219 +0000 UTC Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.488310 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.488369 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.488383 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.488402 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.488449 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.591079 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.591121 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.591131 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.591146 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.591156 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.693546 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.693586 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.693597 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.693615 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.693627 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.795808 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.795849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.795861 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.795879 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.795893 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.898066 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.898122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.898135 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.898154 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:11 crc kubenswrapper[4992]: I0131 09:26:11.898166 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:11Z","lastTransitionTime":"2026-01-31T09:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.000927 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.000976 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.000989 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.001006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.001016 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.103617 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.103661 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.103674 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.103692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.103703 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.194938 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.207059 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.207117 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.207135 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.207154 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.207169 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.234794 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.234856 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.234869 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.234889 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.234906 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: E0131 09:26:12.246845 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.250340 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.250381 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.250393 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.250411 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.250441 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: E0131 09:26:12.263885 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.268791 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.268836 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.268847 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.268866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.268882 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: E0131 09:26:12.281765 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.286504 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.286556 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.286568 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.286596 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.286609 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: E0131 09:26:12.301021 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.305317 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.305370 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.305383 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.305402 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.305433 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: E0131 09:26:12.319973 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:12 crc kubenswrapper[4992]: E0131 09:26:12.320159 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.322744 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.322782 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.322797 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.322823 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.322840 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.395796 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 09:19:32.757343236 +0000 UTC Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.426083 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.426121 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.426134 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.426151 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.426164 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.528120 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.528157 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.528171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.528187 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.528197 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.630391 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.630451 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.630462 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.630479 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.630493 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.733953 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.734018 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.734036 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.734062 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.734106 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.841937 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.841981 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.842013 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.842033 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.842046 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.944072 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.944109 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.944117 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.944131 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:12 crc kubenswrapper[4992]: I0131 09:26:12.944141 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:12Z","lastTransitionTime":"2026-01-31T09:26:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.046050 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.046098 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.046111 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.046130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.046143 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.086482 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:13 crc kubenswrapper[4992]: E0131 09:26:13.086629 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:26:13 crc kubenswrapper[4992]: E0131 09:26:13.086690 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:26:45.086671041 +0000 UTC m=+101.058063028 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.148666 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.148709 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.148721 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.148740 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.148756 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.182277 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.182354 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.182276 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:13 crc kubenswrapper[4992]: E0131 09:26:13.182405 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.182450 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:13 crc kubenswrapper[4992]: E0131 09:26:13.182556 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:13 crc kubenswrapper[4992]: E0131 09:26:13.182602 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:13 crc kubenswrapper[4992]: E0131 09:26:13.182676 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.252407 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.252471 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.252484 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.252505 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.252518 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.355253 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.355297 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.355308 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.355322 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.355333 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.395969 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 22:15:55.146412175 +0000 UTC Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.459021 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.459094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.459104 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.459122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.459133 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.562129 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.562183 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.562194 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.562211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.562221 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.665189 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.665259 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.665267 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.665285 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.665296 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.768073 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.768126 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.768138 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.768159 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.768169 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.871153 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.871196 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.871211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.871226 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.871237 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.973731 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.973779 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.973807 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.973834 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:13 crc kubenswrapper[4992]: I0131 09:26:13.973845 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:13Z","lastTransitionTime":"2026-01-31T09:26:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.076820 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.076885 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.076900 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.076921 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.076935 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.180257 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.180307 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.180318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.180338 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.180354 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.283621 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.283932 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.284039 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.284137 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.284230 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.386526 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.386833 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.386946 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.387038 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.387119 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.396701 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 11:57:51.314036111 +0000 UTC Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.490256 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.490574 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.490676 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.490772 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.490847 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.593979 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.594032 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.594041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.594059 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.594071 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.697114 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.697143 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.697152 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.697166 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.697175 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.800122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.800166 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.800176 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.800193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.800205 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.902947 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.902985 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.902994 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.903012 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:14 crc kubenswrapper[4992]: I0131 09:26:14.903023 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:14Z","lastTransitionTime":"2026-01-31T09:26:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.006050 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.006089 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.006101 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.006117 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.006129 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.108967 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.109015 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.109026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.109061 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.109073 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.182565 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.182613 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:15 crc kubenswrapper[4992]: E0131 09:26:15.182754 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.182799 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.182839 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:15 crc kubenswrapper[4992]: E0131 09:26:15.182961 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:15 crc kubenswrapper[4992]: E0131 09:26:15.183059 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:15 crc kubenswrapper[4992]: E0131 09:26:15.183111 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.210130 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.211138 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.211185 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.211197 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.211216 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.211228 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.221586 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.231709 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.240066 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.249430 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.259206 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc 
kubenswrapper[4992]: I0131 09:26:15.269915 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.282370 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.292727 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.304480 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.313205 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.313252 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.313265 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.313285 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.313329 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.315403 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z 
is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.325363 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f90
13d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.335307 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.347020 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.361862 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.374565 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.393990 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.396901 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 06:30:34.597058304 +0000 UTC Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.406167 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.415573 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 
09:26:15.415629 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.415640 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.415672 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.415683 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.418274 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.518366 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.518411 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.518438 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc 
kubenswrapper[4992]: I0131 09:26:15.518457 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.518467 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.621036 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.621092 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.621108 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.621130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.621145 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.716709 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/0.log" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.716766 4992 generic.go:334] "Generic (PLEG): container finished" podID="6bd42532-8655-4c14-991b-4cc36dea52d5" containerID="29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57" exitCode=1 Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.716815 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerDied","Data":"29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.717499 4992 scope.go:117] "RemoveContainer" containerID="29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.727790 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.727835 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.727845 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.727864 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.727876 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.729556 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var
/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.743103 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.759090 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.770809 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.782564 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc 
kubenswrapper[4992]: I0131 09:26:15.797214 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.810091 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.825122 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.830248 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.830295 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.830306 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.830325 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.830337 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.842590 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.854970 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.874685 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb8123
79d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.887609 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.898472 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.912859 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.922485 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.932873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.932918 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.932928 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.932944 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.932956 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:15Z","lastTransitionTime":"2026-01-31T09:26:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.937051 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.948861 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.957795 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:15 crc kubenswrapper[4992]: I0131 09:26:15.980193 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.036089 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.036158 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.036175 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.036205 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.036228 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.139467 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.139511 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.139529 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.139546 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.139558 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.242055 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.242098 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.242108 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.242124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.242136 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.345039 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.345086 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.345097 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.345117 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.345132 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.397570 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 02:21:28.905059833 +0000 UTC Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.448851 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.448888 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.448898 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.448912 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.448921 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.552389 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.552448 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.552460 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.552476 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.552487 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.655273 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.655316 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.655329 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.655350 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.655361 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.721641 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/0.log" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.721711 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerStarted","Data":"8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.735505 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.749242 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.758244 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.758284 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.758294 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.758317 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.758331 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.761133 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806b
b67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.776023 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.791556 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.811507 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.821708 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.842168 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.854565 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.860570 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.860681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.860691 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.860710 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.860722 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.874532 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.893894 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.905355 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.916498 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.927606 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.939174 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.950720 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc 
kubenswrapper[4992]: I0131 09:26:16.960714 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.964340 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.964431 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.964447 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 
09:26:16.964462 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.964471 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:16Z","lastTransitionTime":"2026-01-31T09:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.972059 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:16 crc kubenswrapper[4992]: I0131 09:26:16.982540 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.072734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.072816 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.072834 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc 
kubenswrapper[4992]: I0131 09:26:17.072873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.072888 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.182024 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.182094 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:17 crc kubenswrapper[4992]: E0131 09:26:17.182177 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:17 crc kubenswrapper[4992]: E0131 09:26:17.182290 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.182529 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.182555 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:17 crc kubenswrapper[4992]: E0131 09:26:17.182594 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:17 crc kubenswrapper[4992]: E0131 09:26:17.182908 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.185685 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.185718 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.185730 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.185743 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.185754 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.290093 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.290142 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.290152 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.290172 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.290187 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.393526 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.393576 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.393588 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.393606 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.393618 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.398584 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 20:49:44.34266427 +0000 UTC Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.496490 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.496533 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.496544 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.496560 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.496572 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.599296 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.599347 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.599365 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.599385 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.599398 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.702247 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.702311 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.702323 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.702360 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.702375 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.805956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.806022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.806041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.806065 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.806085 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.908622 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.908692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.908713 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.908740 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:17 crc kubenswrapper[4992]: I0131 09:26:17.908758 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:17Z","lastTransitionTime":"2026-01-31T09:26:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.011800 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.011876 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.011897 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.011923 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.011940 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.114848 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.114894 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.114905 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.114922 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.114935 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.216947 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.216991 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.217021 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.217038 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.217049 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.319463 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.319501 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.319509 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.319524 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.319534 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.399032 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 13:40:40.025104225 +0000 UTC Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.422562 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.422625 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.422636 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.422655 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.422668 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.525209 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.525258 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.525270 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.525290 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.525303 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.628708 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.628768 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.628791 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.628818 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.628836 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.731272 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.731338 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.731363 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.731396 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.731513 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.834350 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.834406 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.834430 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.834450 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.834460 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.936813 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.936869 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.936882 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.936902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:18 crc kubenswrapper[4992]: I0131 09:26:18.936915 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:18Z","lastTransitionTime":"2026-01-31T09:26:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.040020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.040075 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.040092 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.040115 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.040131 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.143259 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.143303 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.143312 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.143328 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.143339 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.182269 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.182318 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.182393 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:19 crc kubenswrapper[4992]: E0131 09:26:19.182500 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.182540 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:19 crc kubenswrapper[4992]: E0131 09:26:19.182687 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:19 crc kubenswrapper[4992]: E0131 09:26:19.182805 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:19 crc kubenswrapper[4992]: E0131 09:26:19.182876 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.246483 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.246522 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.246534 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.246553 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.246570 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.349744 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.349791 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.349801 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.349819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.349832 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.399633 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 14:01:52.591227178 +0000 UTC Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.453650 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.453714 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.453730 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.453750 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.453761 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.556903 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.556990 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.557014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.557046 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.557065 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.659249 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.659308 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.659322 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.659342 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.659355 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.761929 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.762335 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.762525 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.762755 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.762943 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.867072 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.867118 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.867130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.867147 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.867159 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.969840 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.970111 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.970271 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.970366 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:19 crc kubenswrapper[4992]: I0131 09:26:19.970485 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:19Z","lastTransitionTime":"2026-01-31T09:26:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.079930 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.079977 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.079988 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.080004 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.080015 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.182639 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.182705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.182732 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.182756 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.182776 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.285273 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.285307 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.285317 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.285333 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.285342 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.387995 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.388477 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.388732 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.388926 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.389086 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.400591 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:21:49.821566628 +0000 UTC Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.492659 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.492726 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.492744 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.492770 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.492789 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.596724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.596807 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.596825 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.596849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.596890 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.701070 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.701119 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.701132 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.701152 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.701164 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.803945 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.804029 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.804103 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.804130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.804149 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.906305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.906448 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.906458 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.906472 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:20 crc kubenswrapper[4992]: I0131 09:26:20.906482 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:20Z","lastTransitionTime":"2026-01-31T09:26:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.008367 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.008394 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.008404 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.008433 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.008442 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.111959 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.112017 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.112034 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.112066 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.112090 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.181962 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.181986 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.182052 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.182101 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:21 crc kubenswrapper[4992]: E0131 09:26:21.182255 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:21 crc kubenswrapper[4992]: E0131 09:26:21.182547 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:21 crc kubenswrapper[4992]: E0131 09:26:21.182999 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:21 crc kubenswrapper[4992]: E0131 09:26:21.183134 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.183380 4992 scope.go:117] "RemoveContainer" containerID="72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.215343 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.215396 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.215456 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.215477 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.215490 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.319373 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.319463 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.319482 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.319512 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.319535 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.401826 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 09:16:19.659120446 +0000 UTC Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.421604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.421648 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.421662 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.421681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.421694 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.524127 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.524178 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.524192 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.524214 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.524228 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.626301 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.626342 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.626351 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.626366 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.626376 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.728718 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.728773 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.728784 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.728803 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.728816 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.745888 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/2.log" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.749164 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.749711 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.763028 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.777696 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.792502 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.807728 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.827114 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.831650 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.831708 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.831724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.831748 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.831764 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.845174 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.859028 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.879538 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.892606 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.906515 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.926758 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.933865 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.933905 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.933914 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.933932 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.933944 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:21Z","lastTransitionTime":"2026-01-31T09:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.940100 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.959683 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.972724 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.984357 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:21 crc kubenswrapper[4992]: I0131 09:26:21.997999 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc 
kubenswrapper[4992]: I0131 09:26:22.010659 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.022530 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.034910 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.036396 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 
09:26:22.036452 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.036465 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.036481 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.036494 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.139084 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.139134 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.139147 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.139165 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.139178 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.242533 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.242880 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.242951 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.243022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.243083 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.346534 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.346641 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.346667 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.346703 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.346725 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.402741 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 20:57:43.017432221 +0000 UTC Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.450593 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.450681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.450694 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.450719 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.450737 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.553484 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.553833 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.553938 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.554026 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.554113 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.656871 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.656933 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.656952 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.656977 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.656995 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.667367 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.667704 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.667861 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.668071 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.668263 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.690302 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.697194 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.697253 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.697271 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.697295 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.697315 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.720792 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.725990 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.726046 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.726072 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.726103 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.726120 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.746798 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.752757 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.752817 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.752833 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.752854 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.754947 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.761286 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/3.log" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.762264 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/2.log" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.766458 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" exitCode=1 Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.766519 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.766572 4992 scope.go:117] "RemoveContainer" containerID="72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.767876 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.768213 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.781527 4992 kubelet_node_status.go:585] "Error updating 
node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.786766 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.786832 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.786845 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.786866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.786882 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.787947 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.806805 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: E0131 09:26:22.807037 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.809390 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.809458 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.809476 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.809494 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.809507 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.809881 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.830615 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d
3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/
kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.846829 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.861971 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.879519 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.900576 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.913964 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.914214 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.914310 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.914453 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.914585 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:22Z","lastTransitionTime":"2026-01-31T09:26:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.918144 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.943725 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:22 crc kubenswrapper[4992]: I0131 09:26:22.960681 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.002106 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.017357 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.017409 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.017450 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.017474 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.017492 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.036180 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72c8a03ae20760f4cb1a391cbfe3e2aaace1231b140590e47a3b5434d775a156\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:25:54Z\\\",\\\"message\\\":\\\"254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236036 6642 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:25:54.236058 6642 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-canary/ingress-canary]} name:Service_openshift-ingress-canary/ingress-canary_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.34:8443: 10.217.5.34:8888:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7715118b-bb1b-400a-803e-7ab2cc3eeec0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0131 09:25:54.236241 6642 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 09:25:54.236331 6642 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"netes/ovnkube-node-46cdx after 0 failed attempt(s)\\\\nI0131 09:26:22.061782 7036 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-46cdx\\\\nI0131 09:26:22.061488 7036 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061793 7036 ovn.go:134] Ensuring zone local for Pod 
openshift-multus/multus-additional-cni-plugins-9s7nb in node crc\\\\nI0131 09:26:22.061799 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb after 0 failed attempt(s)\\\\nI0131 09:26:22.061803 7036 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061800 7036 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:26:22.061493 7036 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts in node crc\\\\nI0131 09:26:22.061816 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts after 0 failed attempt(s)\\\\nI0131 09:26:22.061821 7036 default_network_controller.go:776] Recording success event on pod 
openshift\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:26:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f
069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.051673 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.068124 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.080409 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a48
9ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.092712 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.105515 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc 
kubenswrapper[4992]: I0131 09:26:23.117335 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.120081 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.120131 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.120142 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 
09:26:23.120158 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.120168 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.129545 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.182543 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.182633 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.182635 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.182543 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:23 crc kubenswrapper[4992]: E0131 09:26:23.182720 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:23 crc kubenswrapper[4992]: E0131 09:26:23.182868 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:23 crc kubenswrapper[4992]: E0131 09:26:23.182932 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:23 crc kubenswrapper[4992]: E0131 09:26:23.182994 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.223378 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.223452 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.223465 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.223483 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.223499 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.327909 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.327974 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.327991 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.328015 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.328032 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.403176 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 11:51:37.159130757 +0000 UTC Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.430752 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.430793 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.430824 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.430843 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.430855 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.534524 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.534923 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.535166 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.535369 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.535575 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.640972 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.641341 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.641512 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.641676 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.641829 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.745409 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.746267 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.746560 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.746765 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.747030 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.770894 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/3.log" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.775794 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:26:23 crc kubenswrapper[4992]: E0131 09:26:23.776152 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.790488 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.803320 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.814743 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.824939 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.835782 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc 
kubenswrapper[4992]: I0131 09:26:23.847600 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.854493 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.854575 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.854615 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.854658 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.854683 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.862845 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.876359 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.891172 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.912592 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.935908 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.952913 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.956938 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.956979 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.957004 4992 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.957022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.957033 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:23Z","lastTransitionTime":"2026-01-31T09:26:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.969283 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:23 crc kubenswrapper[4992]: I0131 09:26:23.992821 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:23Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.009245 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:26:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.026377 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.048476 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.059687 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.059730 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.059745 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.059764 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.059778 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.062717 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.084394 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"netes/ovnkube-node-46cdx after 0 failed attempt(s)\\\\nI0131 09:26:22.061782 7036 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-46cdx\\\\nI0131 09:26:22.061488 7036 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061793 
7036 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9s7nb in node crc\\\\nI0131 09:26:22.061799 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb after 0 failed attempt(s)\\\\nI0131 09:26:22.061803 7036 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061800 7036 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:26:22.061493 7036 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts in node crc\\\\nI0131 09:26:22.061816 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts after 0 failed attempt(s)\\\\nI0131 09:26:22.061821 7036 default_network_controller.go:776] Recording success event on pod openshift\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:26:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.162072 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.162133 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.162148 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.162175 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.162189 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.265433 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.265470 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.265481 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.265497 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.265510 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.368383 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.368519 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.368543 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.368578 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.368606 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.404246 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 11:44:23.504363213 +0000 UTC Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.471735 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.471779 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.471794 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.471813 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.471826 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.574949 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.574995 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.575006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.575042 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.575059 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.677395 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.677946 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.678063 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.678168 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.678260 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.781215 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.781295 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.781316 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.781343 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.781359 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.884842 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.884935 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.884964 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.884994 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.885012 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.987641 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.987701 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.987714 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.987734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:24 crc kubenswrapper[4992]: I0131 09:26:24.987747 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:24Z","lastTransitionTime":"2026-01-31T09:26:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.090528 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.090594 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.090611 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.090639 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.090661 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.181753 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.181779 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.181828 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.181851 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:25 crc kubenswrapper[4992]: E0131 09:26:25.182193 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:25 crc kubenswrapper[4992]: E0131 09:26:25.182482 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:25 crc kubenswrapper[4992]: E0131 09:26:25.182642 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:25 crc kubenswrapper[4992]: E0131 09:26:25.182754 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.193967 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.194251 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.194334 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.194435 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.194577 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.203908 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.225271 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.245192 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.270153 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"netes/ovnkube-node-46cdx after 0 failed attempt(s)\\\\nI0131 09:26:22.061782 7036 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-46cdx\\\\nI0131 09:26:22.061488 7036 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061793 
7036 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9s7nb in node crc\\\\nI0131 09:26:22.061799 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb after 0 failed attempt(s)\\\\nI0131 09:26:22.061803 7036 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061800 7036 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:26:22.061493 7036 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts in node crc\\\\nI0131 09:26:22.061816 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts after 0 failed attempt(s)\\\\nI0131 09:26:22.061821 7036 default_network_controller.go:776] Recording success event on pod openshift\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:26:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.283504 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.297726 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.297780 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.297789 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.297805 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.297820 4992 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.304680 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.324937 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.338827 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.353080 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc 
kubenswrapper[4992]: I0131 09:26:25.371255 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.386890 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.400101 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.400168 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.400181 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.400224 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.400239 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.403522 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.404471 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 00:32:04.749767589 +0000 UTC Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.419528 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.436660 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.460688 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.479850 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.498865 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.503470 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.503569 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.503594 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.503663 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.503696 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.517056 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.531400 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:25Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.606353 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.606410 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.606452 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.606500 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.606519 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.710122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.710189 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.710203 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.710226 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.710243 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.813846 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.813951 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.813974 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.814149 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.814169 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.916669 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.916724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.916740 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.916760 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:25 crc kubenswrapper[4992]: I0131 09:26:25.916775 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:25Z","lastTransitionTime":"2026-01-31T09:26:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.019016 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.019056 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.019067 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.019084 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.019098 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.121579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.121624 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.121635 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.121652 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.121666 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.224251 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.224304 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.224317 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.224334 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.224347 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.327805 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.327861 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.327873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.327894 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.327909 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.405490 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 09:13:56.110310087 +0000 UTC Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.431483 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.431554 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.431573 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.431604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.431627 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.534412 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.534463 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.534471 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.534485 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.534499 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.637798 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.637847 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.637858 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.637877 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.637891 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.740764 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.740819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.740828 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.740845 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.740856 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.843873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.843916 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.843927 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.843944 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.843956 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.951205 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.951270 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.951294 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.951332 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:26 crc kubenswrapper[4992]: I0131 09:26:26.951353 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:26Z","lastTransitionTime":"2026-01-31T09:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.054318 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.054386 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.054406 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.054469 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.054488 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.157977 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.158084 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.158111 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.158150 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.158179 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.182327 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.182467 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.182486 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:27 crc kubenswrapper[4992]: E0131 09:26:27.182639 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:27 crc kubenswrapper[4992]: E0131 09:26:27.182751 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.182839 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:27 crc kubenswrapper[4992]: E0131 09:26:27.182890 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:27 crc kubenswrapper[4992]: E0131 09:26:27.183098 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.261690 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.261767 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.261790 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.261819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.261842 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.365586 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.365642 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.365652 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.365678 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.365691 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.406465 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:12:50.821931943 +0000 UTC Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.468297 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.468347 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.468356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.468382 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.468402 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.570514 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.570613 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.570647 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.570677 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.570696 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.673625 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.673681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.673699 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.673724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.673739 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.776485 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.776532 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.776544 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.776559 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.776572 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.879746 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.879819 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.879840 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.879873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.879892 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.983077 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.983144 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.983162 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.983188 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:27 crc kubenswrapper[4992]: I0131 09:26:27.983211 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:27Z","lastTransitionTime":"2026-01-31T09:26:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.086545 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.086610 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.086627 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.086648 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.086661 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.189949 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.190712 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.190952 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.191311 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.191489 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.294955 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.295011 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.295022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.295041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.295053 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.398343 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.398409 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.398440 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.398462 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.398476 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.407519 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 17:31:56.751325799 +0000 UTC Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.501433 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.501468 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.501478 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.501492 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.501503 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.604580 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.604817 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.604953 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.605039 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.605121 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.708161 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.708238 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.708261 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.708286 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.708303 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.811901 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.811958 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.811975 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.812001 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.812026 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.915463 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.915511 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.915519 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.915537 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:28 crc kubenswrapper[4992]: I0131 09:26:28.915548 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:28Z","lastTransitionTime":"2026-01-31T09:26:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.019119 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.019170 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.019181 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.019201 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.019214 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.122247 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.122310 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.122328 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.122353 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.122372 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.182130 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.182153 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.182201 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.182243 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.182394 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.182722 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.182852 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.182936 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.226758 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.226785 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.226796 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.226810 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.226835 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.330148 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.330193 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.330206 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.330224 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.330237 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.408023 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 05:42:41.600655948 +0000 UTC Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.433465 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.433511 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.433522 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.433556 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.433571 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.452579 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.452741 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.452773 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.452800 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.452823 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.452949 4992 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453013 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.452994731 +0000 UTC m=+149.424386718 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453212 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.453203277 +0000 UTC m=+149.424595264 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453276 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453295 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453307 4992 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453336 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.45332842 +0000 UTC m=+149.424720407 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453477 4992 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453500 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.453494525 +0000 UTC m=+149.424886512 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453677 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453725 4992 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453738 4992 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:26:29 crc kubenswrapper[4992]: E0131 09:26:29.453816 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.453792413 +0000 UTC m=+149.425184400 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.536082 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.536141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.536153 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.536175 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.536189 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.639021 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.639074 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.639086 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.639106 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.639118 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.743179 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.743250 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.743265 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.743283 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.743767 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.846350 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.846402 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.846436 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.846456 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.846469 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.948329 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.948368 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.948380 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.948397 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:29 crc kubenswrapper[4992]: I0131 09:26:29.948408 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:29Z","lastTransitionTime":"2026-01-31T09:26:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.050954 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.050998 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.051007 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.051023 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.051033 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.154143 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.154183 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.154194 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.154210 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.154222 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.257665 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.257708 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.257724 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.257750 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.257770 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.361286 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.361361 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.361382 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.361409 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.361470 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.409178 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:21:25.608749918 +0000 UTC Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.465312 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.465377 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.465396 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.465436 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.465451 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.568122 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.568224 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.568246 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.568272 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.568291 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.671242 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.671286 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.671297 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.671313 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.671325 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.774186 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.774268 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.774288 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.774315 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.774334 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.877120 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.877173 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.877183 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.877198 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.877230 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.979925 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.979969 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.979980 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.979997 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:30 crc kubenswrapper[4992]: I0131 09:26:30.980008 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:30Z","lastTransitionTime":"2026-01-31T09:26:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.083020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.083065 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.083077 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.083101 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.083117 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.182458 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.182491 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:31 crc kubenswrapper[4992]: E0131 09:26:31.182651 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.182716 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.182699 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:31 crc kubenswrapper[4992]: E0131 09:26:31.183027 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:31 crc kubenswrapper[4992]: E0131 09:26:31.183141 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:31 crc kubenswrapper[4992]: E0131 09:26:31.183280 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.190483 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.190550 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.190563 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.190582 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.190594 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.301741 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.301787 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.301797 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.301816 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.301827 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.403889 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.403928 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.403937 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.403951 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.403960 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.410273 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 04:24:44.165078819 +0000 UTC Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.506730 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.506824 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.506847 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.506874 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.506891 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.609058 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.609106 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.609121 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.609140 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.609152 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.712825 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.712869 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.712880 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.712897 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.712909 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.815866 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.816266 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.816527 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.816754 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.816959 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.919767 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.919804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.919816 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.919835 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:31 crc kubenswrapper[4992]: I0131 09:26:31.919846 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:31Z","lastTransitionTime":"2026-01-31T09:26:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.022771 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.022810 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.022820 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.022835 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.022846 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.125561 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.125627 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.125645 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.125671 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.125688 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.228017 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.228484 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.228602 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.228654 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.228669 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.332484 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.332545 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.332558 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.332579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.332593 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.411441 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 02:55:54.055049423 +0000 UTC Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.435797 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.435846 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.435858 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.435883 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.435896 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.539788 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.539849 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.539860 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.540000 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.540023 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.643191 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.643233 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.643244 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.643261 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.643273 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.746092 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.746559 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.746572 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.746620 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.746670 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.849937 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.849997 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.850017 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.850043 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.850062 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.953098 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.953157 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.953174 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.953201 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:32 crc kubenswrapper[4992]: I0131 09:26:32.953219 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:32Z","lastTransitionTime":"2026-01-31T09:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.057213 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.057269 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.057286 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.057307 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.057320 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.103079 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.103121 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.103132 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.103148 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.103157 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.122348 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.127577 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.127623 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.127636 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.127657 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.127673 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.144475 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.149878 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.150116 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.150129 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.150147 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.150157 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.167269 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.172499 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.172575 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.172643 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.172695 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.172712 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.182023 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.182105 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.182050 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.182043 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.182237 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.182367 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.182531 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.182603 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.192362 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.197267 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.197332 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.197347 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.197374 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.197389 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.214144 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"dd4112a1-95ba-4903-8139-c099442066c8\\\",\\\"systemUUID\\\":\\\"a568f6a4-7307-4080-940d-10f688be5b04\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:33 crc kubenswrapper[4992]: E0131 09:26:33.214307 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.216264 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.216324 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.216337 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.216359 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.216373 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.319888 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.319967 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.319986 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.320015 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.320040 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.411961 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:43:25.77334584 +0000 UTC Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.423915 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.423983 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.423994 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.424014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.424028 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.526851 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.527293 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.527566 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.527738 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.527906 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.630977 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.631010 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.631019 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.631035 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.631046 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.734345 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.734449 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.734481 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.734514 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.734539 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.837625 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.837682 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.837699 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.837725 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.837747 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.941970 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.942050 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.942076 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.942108 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:33 crc kubenswrapper[4992]: I0131 09:26:33.942133 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:33Z","lastTransitionTime":"2026-01-31T09:26:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.045220 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.045281 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.045304 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.045333 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.045363 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.148020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.148086 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.148104 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.148130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.148152 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.251517 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.251560 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.251579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.251604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.251617 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.354358 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.354403 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.354414 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.354448 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.354463 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.412944 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 22:59:09.78033869 +0000 UTC Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.456787 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.456847 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.456880 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.456902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.456917 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.561723 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.562130 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.562247 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.562372 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.562503 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.666397 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.666524 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.666541 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.666562 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.666580 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.770778 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.770854 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.770868 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.770887 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.770928 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.875505 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.875596 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.875615 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.875651 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.875673 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.979361 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.979499 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.979526 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.979553 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:34 crc kubenswrapper[4992]: I0131 09:26:34.979571 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:34Z","lastTransitionTime":"2026-01-31T09:26:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.082890 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.083178 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.083242 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.083305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.083365 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.182582 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.182603 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.182731 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:35 crc kubenswrapper[4992]: E0131 09:26:35.182885 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:35 crc kubenswrapper[4992]: E0131 09:26:35.183017 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.183046 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:35 crc kubenswrapper[4992]: E0131 09:26:35.183122 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:35 crc kubenswrapper[4992]: E0131 09:26:35.183181 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.186584 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:26:35 crc kubenswrapper[4992]: E0131 09:26:35.187021 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.189070 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.189118 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.189140 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.189185 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.189214 4992 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.211258 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"377e2b66-6a04-467a-8960-241809326520\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://357f2dedd8133086fc0ed3fdbd849c329e0e188b611c6dc491f597886bcfb3f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://492c0a7e1ac7f9153252db7e7d10fe9dcecb227af786556ba9df94207980b78d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://9c3c29e9da7cf0ce11cd040e26663a565ded36086d3c7260a252868c8e889fa0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.233225 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1785b018-a3bb-45a6-97e7-4027373f6c6b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02672dc43413e911b9a81cd11509509bfe92fb72dd403eb8b052cdeceffb0537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b04d5bf660214bf3a60b1f2b2dc1be26fadbc1eca2cb41156bda68db68583f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c6368831eaa04c16890f0f8cd508e6a25399ea0d7a2f56c16b6210902207c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://17f9b7064ab04a3c02bcff6e900b982818ec8e105334db5961446c4bdf0735e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.257060 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b08c2628-89dd-47c3-9c25-7799a63c225b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:25:25Z\\\"
,\\\"message\\\":\\\"pace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:25:09.540893 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:25:09.543274 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1895105709/tls.crt::/tmp/serving-cert-1895105709/tls.key\\\\\\\"\\\\nI0131 09:25:25.040520 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:25:25.044539 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:25:25.044556 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:25:25.044576 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:25:25.044581 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:25:25.048732 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0131 09:25:25.048750 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:25:25.048774 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048781 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:25:25.048788 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:25:25.048794 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0131 09:25:25.048800 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:25:25.048806 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:25:25.051942 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede030ca5257587d3b280dd412bbf496
e4781d4b258820c111d1dc65d2a9b2c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.273474 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://213132b5f2c2c76a197ce8973d389979c754dea7e7e441beaeaeb1dd9d50f03c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.291990 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.292051 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.292071 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.292134 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.292156 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.293955 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bjplh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bd42532-8655-4c14-991b-4cc36dea52d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:26:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:25:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc\\\\n2026-01-31T09:25:29+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4b578ae8-ad38-464b-a6ea-c6873784f1fc to /host/opt/cni/bin/\\\\n2026-01-31T09:25:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:25:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:26:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:26:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bfcv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bjplh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.318070 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f703e616-6fa1-4161-ada9-06762640252c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dec9605bea9b1f780b87f59bc00fbe31e773cd7390c221a78be0b86c44fc1832\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce0d508e7906b4c76940234b08445060b9b5c6d87e3fb10683c12b7e9ef8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8602abe4d30627e10e1d6f135913bc15380f91d687ac7b47be2d0a61aca19596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4462a0595846a57a0cb93e953d24b7ce39842511aaaba8858b3da735091d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d1b493bd7bea576e3fc24d7de607bda1ad0178a66a83f50c3b5dbc877b07760\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddd122d7ece3cacda323b6f181f31a08917a59d1a272e8feb812379d04301845\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31385ec0d66c9e6a90de8e61b047c3c703fec77e6c117751e8ae11e36e166636\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24aeb686bde7053bb9d335c0b839dcb55cadd381b5f7ebe72ca99a79b8e8d3f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.335765 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.354006 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.373299 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa856ff8-dbc2-46d7-9df9-eb4320bd69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63786b459e7ae2ecf788f3f8493061052d5a04fdc0171e4e7c2abede2b55f416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5538a0c9a55fa6d7484d1cc99e62e04ef55c56109abd4572b3c1c3b7cd0a5ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2f59afad5e89e70cc61d4b8f56046a5ce6104fa1854371a1d61c2dc4c6a7080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84759cf9922ea6ac6f525006ef4a618152a1e85b3f88f811b810d6bdb2fd952\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a9dd
bc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a9ddbc570197a523190fc437ba40c2bebaaa813d4a0d80d8abe7c30cbac8331\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://891944c4caae69bfe1abc4c037fca5186fb01ce44304d0bfe7f91bbb1709a7a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:34Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e244ceb5baa2ee20220ca027d2b4dd61bd9580c92de9ea4e1c23efa0ca39ed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h8c2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9s7nb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.387903 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0d127d3-476a-4068-a55a-919fcd4b187d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2561efd577049420da30217341faf50c53b235ab39aee3fc2598662af0ff72c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://632255e2dcff9fd556958b51250d70ba7293d29625ae3c4b9cde8589d5215483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v92lv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k66ts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.394965 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.395000 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.395016 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.395060 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.395077 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.406891 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.413307 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 04:48:40.265482735 +0000 UTC Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.422136 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c3c1a79e88acf132402fe2c2cd597d03e3735547a2fb10ea34d484262842cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c1fee1388153b7c53b330d4c8dfbb48d40d4e3cfeabf82425b7fe517a31bd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.434688 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-pt7xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9caf126d-53ac-498b-97d4-89c3c435805e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f282eeba59c51e1f894a55ecfd27072087a0167041b51d98465dc4c4abf2f278\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wcmb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-pt7xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.455971 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6939ca32-c541-41c0-ba96-4282b942ff16\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:26:22Z\\\",\\\"message\\\":\\\"netes/ovnkube-node-46cdx after 0 failed attempt(s)\\\\nI0131 09:26:22.061782 7036 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-46cdx\\\\nI0131 09:26:22.061488 7036 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061793 
7036 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-9s7nb in node crc\\\\nI0131 09:26:22.061799 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-9s7nb after 0 failed attempt(s)\\\\nI0131 09:26:22.061803 7036 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-9s7nb\\\\nI0131 09:26:22.061800 7036 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}\\\\nI0131 09:26:22.061493 7036 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts in node crc\\\\nI0131 09:26:22.061816 7036 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts after 0 failed attempt(s)\\\\nI0131 09:26:22.061821 7036 default_network_controller.go:776] Recording success event on pod openshift\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:26:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae1f259b396eccdc65
61da02fa7f24426072d274e5f123d46a9a67041f069d66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dsg2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-46cdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.467908 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6268a7dc-3015-440d-aa5a-3a25b7664eee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55a9078475b0ae70324c309ff98f1d0c156f8363d66673e308dc65354001590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f9eda80a972c3d9154a59a1e2e20db288e8fe7c1e30180af89e318ec0d969b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:25:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.483871 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e0484d8e5b7f7e864c6e3b141097676c1c9090b8718b365bd6d5b0ae9adee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.498820 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.499081 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.499300 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.499523 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.499715 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.502397 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d252d5-9d5b-422f-baee-f350df5664b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcac7d114c22dfdeea38411ec534c4f95bfa5afadd52676583455fba748744a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5qh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v7wks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.515845 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9jjrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48594b98-9b83-4c95-80a5-5655ce93a260\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e3cc946f8b8e96c6cbf1cf600e23de11a5e23ab60ec678b2e02faabc48b4a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:25:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccftx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9jjrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.526854 4992 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bplq6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"afb1d129-e6bb-4db2-8204-3a1f4d91048e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:25:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c9nkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:25:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bplq6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:26:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:26:35 crc 
kubenswrapper[4992]: I0131 09:26:35.603956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.604009 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.604022 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.604041 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.604054 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.707653 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.707692 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.707703 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.707720 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.707732 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.810607 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.810658 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.810668 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.810686 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.810698 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.916768 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.916857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.916877 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.916971 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:35 crc kubenswrapper[4992]: I0131 09:26:35.917033 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:35Z","lastTransitionTime":"2026-01-31T09:26:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.022226 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.022261 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.022269 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.022283 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.022292 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.125660 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.125706 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.125717 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.125734 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.125743 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.228731 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.228804 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.228826 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.228857 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.228878 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.332244 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.332299 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.332311 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.332330 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.332343 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.414075 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 20:42:15.61794686 +0000 UTC Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.436048 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.436124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.436144 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.436174 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.436200 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.538918 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.538981 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.538996 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.539018 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.539035 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.641477 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.641548 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.641566 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.641595 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.641618 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.745094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.745167 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.745180 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.745202 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.745215 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.847936 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.847971 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.847981 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.848182 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.848191 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.951572 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.951609 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.951620 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.951634 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:36 crc kubenswrapper[4992]: I0131 09:26:36.951644 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:36Z","lastTransitionTime":"2026-01-31T09:26:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.054533 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.054573 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.054581 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.054595 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.054608 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.158110 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.158190 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.158199 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.158222 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.158233 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.182808 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.183591 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.183614 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.183676 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:37 crc kubenswrapper[4992]: E0131 09:26:37.183734 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:37 crc kubenswrapper[4992]: E0131 09:26:37.183885 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:37 crc kubenswrapper[4992]: E0131 09:26:37.184079 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:37 crc kubenswrapper[4992]: E0131 09:26:37.184196 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.261536 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.261594 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.261616 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.261644 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.261668 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.366110 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.366171 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.366189 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.366225 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.366251 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.414520 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 22:48:40.049615854 +0000 UTC Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.469794 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.469972 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.469993 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.470059 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.470081 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.574139 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.574211 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.574226 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.574246 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.574259 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.677632 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.678020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.678094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.678160 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.678221 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.782009 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.782074 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.782093 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.782121 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.782166 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.885833 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.886305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.886418 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.886542 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.886641 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.990166 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.990236 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.990262 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.990294 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:37 crc kubenswrapper[4992]: I0131 09:26:37.990318 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:37Z","lastTransitionTime":"2026-01-31T09:26:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.098098 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.098161 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.098179 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.098221 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.098242 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.208141 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.208205 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.208226 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.209003 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.209053 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.323618 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.323705 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.323727 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.323756 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.323778 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.414828 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 22:02:21.109216518 +0000 UTC Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.426367 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.426464 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.426486 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.426510 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.426528 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.530874 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.530945 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.530968 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.531013 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.531045 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.633998 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.634054 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.634066 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.634085 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.634097 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.737797 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.737884 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.737902 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.737928 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.737945 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.840973 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.841052 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.841084 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.841124 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.841146 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.945615 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.945693 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.945708 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.945733 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:38 crc kubenswrapper[4992]: I0131 09:26:38.945747 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:38Z","lastTransitionTime":"2026-01-31T09:26:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.050739 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.050826 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.050850 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.050878 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.050897 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.155856 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.155912 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.155935 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.155956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.156008 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.182107 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.182180 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.182384 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.182387 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:39 crc kubenswrapper[4992]: E0131 09:26:39.182538 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:39 crc kubenswrapper[4992]: E0131 09:26:39.182614 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:39 crc kubenswrapper[4992]: E0131 09:26:39.182693 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:39 crc kubenswrapper[4992]: E0131 09:26:39.182872 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.258524 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.258588 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.258604 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.258625 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.258640 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.362200 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.362280 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.362305 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.362331 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.362350 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.415994 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:08:38.454584191 +0000 UTC Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.464974 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.465027 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.465043 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.465063 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.465077 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.568007 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.568069 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.568086 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.568108 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.568125 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.671245 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.671291 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.671304 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.671323 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.671337 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.774534 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.774585 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.774602 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.774622 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.774635 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.877583 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.877638 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.877650 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.877669 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.877682 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.979952 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.980024 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.980042 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.980071 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:39 crc kubenswrapper[4992]: I0131 09:26:39.980089 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:39Z","lastTransitionTime":"2026-01-31T09:26:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.082711 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.082759 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.082774 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.082794 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.082811 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.185597 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.185695 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.185718 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.185752 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.185774 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.289014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.289070 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.289081 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.289099 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.289108 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.397592 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.397681 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.397702 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.397729 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.397748 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.416974 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 21:56:14.810527137 +0000 UTC Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.500792 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.500877 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.500894 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.500919 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.500936 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.604167 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.604227 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.604244 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.604269 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.604286 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.708020 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.708084 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.708102 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.708128 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.708147 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.810711 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.810774 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.810792 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.810817 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.810836 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.914511 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.914579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.914596 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.914620 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:40 crc kubenswrapper[4992]: I0131 09:26:40.914637 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:40Z","lastTransitionTime":"2026-01-31T09:26:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.017632 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.017684 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.017700 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.017718 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.017731 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.120487 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.120536 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.120554 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.120579 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.120595 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.182676 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.182729 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.182882 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.183063 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:41 crc kubenswrapper[4992]: E0131 09:26:41.183051 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:41 crc kubenswrapper[4992]: E0131 09:26:41.183208 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:41 crc kubenswrapper[4992]: E0131 09:26:41.183514 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:41 crc kubenswrapper[4992]: E0131 09:26:41.183505 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.224738 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.224809 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.224830 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.224854 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.224874 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.328080 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.328149 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.328168 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.328195 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.328214 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.417597 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 07:02:02.369931567 +0000 UTC Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.431155 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.431213 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.431229 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.431392 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.431427 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.534288 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.534342 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.534358 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.534376 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.534388 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.637622 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.637695 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.637720 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.637750 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.637774 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.741282 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.741356 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.741381 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.741412 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.741474 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.845841 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.845906 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.845925 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.845956 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.845981 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.948196 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.948240 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.948256 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.948276 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:41 crc kubenswrapper[4992]: I0131 09:26:41.948290 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:41Z","lastTransitionTime":"2026-01-31T09:26:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.050962 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.051004 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.051016 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.051033 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.051043 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.153961 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.154014 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.154030 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.154051 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.154063 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.256911 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.256984 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.257006 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.257035 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.257058 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.360168 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.360209 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.360218 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.360233 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.360245 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.417787 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 03:20:22.569963036 +0000 UTC Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.463317 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.463384 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.463465 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.463499 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.463521 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.566602 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.566676 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.566699 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.566728 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.566753 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.670094 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.670150 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.670163 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.670182 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.670197 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.773153 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.773212 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.773234 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.773263 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.773287 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.875881 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.875944 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.875968 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.875995 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.876017 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.980118 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.980186 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.980202 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.980235 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:42 crc kubenswrapper[4992]: I0131 09:26:42.980251 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:42Z","lastTransitionTime":"2026-01-31T09:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.083280 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.083370 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.083387 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.083474 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.083488 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:43Z","lastTransitionTime":"2026-01-31T09:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.189517 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:43 crc kubenswrapper[4992]: E0131 09:26:43.189646 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.189961 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.190025 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.190094 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:43 crc kubenswrapper[4992]: E0131 09:26:43.190330 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:43 crc kubenswrapper[4992]: E0131 09:26:43.190607 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:43 crc kubenswrapper[4992]: E0131 09:26:43.190921 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.191836 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.191873 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.191884 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.191901 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.191914 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:43Z","lastTransitionTime":"2026-01-31T09:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.295114 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.295187 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.295210 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.295239 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.295260 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:43Z","lastTransitionTime":"2026-01-31T09:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.398206 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.398260 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.398277 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.398304 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.398321 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:43Z","lastTransitionTime":"2026-01-31T09:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.418697 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 16:01:54.084337266 +0000 UTC Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.428696 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.428761 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.428786 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.428816 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.428838 4992 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:26:43Z","lastTransitionTime":"2026-01-31T09:26:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.492933 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w"] Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.493517 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.496837 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.496948 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.497400 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.498750 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.529223 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.529198163 podStartE2EDuration="1m18.529198163s" podCreationTimestamp="2026-01-31 09:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.529027899 +0000 UTC m=+99.500419976" watchObservedRunningTime="2026-01-31 09:26:43.529198163 +0000 UTC m=+99.500590180" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.595216 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bjplh" podStartSLOduration=76.595195103 podStartE2EDuration="1m16.595195103s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.575351278 +0000 UTC m=+99.546743275" watchObservedRunningTime="2026-01-31 09:26:43.595195103 +0000 UTC m=+99.566587100" Jan 31 
09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.614360 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.614333507 podStartE2EDuration="1m18.614333507s" podCreationTimestamp="2026-01-31 09:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.614102581 +0000 UTC m=+99.585494578" watchObservedRunningTime="2026-01-31 09:26:43.614333507 +0000 UTC m=+99.585725494" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.621841 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.621903 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.621928 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 
09:26:43.621952 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.621972 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.630940 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.6309138 podStartE2EDuration="51.6309138s" podCreationTimestamp="2026-01-31 09:25:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.630246321 +0000 UTC m=+99.601638318" watchObservedRunningTime="2026-01-31 09:26:43.6309138 +0000 UTC m=+99.602305787" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.683808 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9s7nb" podStartSLOduration=76.683786655 podStartE2EDuration="1m16.683786655s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.672401271 +0000 UTC m=+99.643793268" watchObservedRunningTime="2026-01-31 09:26:43.683786655 +0000 UTC 
m=+99.655178642" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.715385 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k66ts" podStartSLOduration=76.715366014 podStartE2EDuration="1m16.715366014s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.68503104 +0000 UTC m=+99.656423017" watchObservedRunningTime="2026-01-31 09:26:43.715366014 +0000 UTC m=+99.686758001" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.715863 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.715858108 podStartE2EDuration="1m16.715858108s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.713686916 +0000 UTC m=+99.685078903" watchObservedRunningTime="2026-01-31 09:26:43.715858108 +0000 UTC m=+99.687250095" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723363 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723402 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723482 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723512 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723535 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723531 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.723637 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.724364 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.729369 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.758241 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0b22fe6-2514-48cf-8fb1-d4c54a9425e1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pq98w\" (UID: \"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.768441 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pt7xd" podStartSLOduration=78.768403864 podStartE2EDuration="1m18.768403864s" podCreationTimestamp="2026-01-31 09:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.74122024 +0000 UTC m=+99.712612237" 
watchObservedRunningTime="2026-01-31 09:26:43.768403864 +0000 UTC m=+99.739795851" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.809589 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" Jan 31 09:26:43 crc kubenswrapper[4992]: W0131 09:26:43.823195 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b22fe6_2514_48cf_8fb1_d4c54a9425e1.slice/crio-86d0bd93720ee88a8a505990327a5e36d7eb57bf127ef5957ba0128174c59855 WatchSource:0}: Error finding container 86d0bd93720ee88a8a505990327a5e36d7eb57bf127ef5957ba0128174c59855: Status 404 returned error can't find the container with id 86d0bd93720ee88a8a505990327a5e36d7eb57bf127ef5957ba0128174c59855 Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.834721 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podStartSLOduration=76.834700132 podStartE2EDuration="1m16.834700132s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.819716695 +0000 UTC m=+99.791108722" watchObservedRunningTime="2026-01-31 09:26:43.834700132 +0000 UTC m=+99.806092139" Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.835407 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9jjrt" podStartSLOduration=76.835398252 podStartE2EDuration="1m16.835398252s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.834331581 +0000 UTC m=+99.805723588" watchObservedRunningTime="2026-01-31 09:26:43.835398252 +0000 UTC m=+99.806790259" 
Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.852856 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" event={"ID":"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1","Type":"ContainerStarted","Data":"86d0bd93720ee88a8a505990327a5e36d7eb57bf127ef5957ba0128174c59855"} Jan 31 09:26:43 crc kubenswrapper[4992]: I0131 09:26:43.858997 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.858978373 podStartE2EDuration="31.858978373s" podCreationTimestamp="2026-01-31 09:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:43.85886011 +0000 UTC m=+99.830252127" watchObservedRunningTime="2026-01-31 09:26:43.858978373 +0000 UTC m=+99.830370360" Jan 31 09:26:44 crc kubenswrapper[4992]: I0131 09:26:44.419633 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 14:47:17.021959736 +0000 UTC Jan 31 09:26:44 crc kubenswrapper[4992]: I0131 09:26:44.419714 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 31 09:26:44 crc kubenswrapper[4992]: I0131 09:26:44.432351 4992 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 09:26:44 crc kubenswrapper[4992]: I0131 09:26:44.857509 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" event={"ID":"e0b22fe6-2514-48cf-8fb1-d4c54a9425e1","Type":"ContainerStarted","Data":"adaca56821871987390902d730cff9446a5a6c89d8de0d49a74e96e4251067d2"} Jan 31 09:26:45 crc kubenswrapper[4992]: I0131 09:26:45.138895 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:45 crc kubenswrapper[4992]: E0131 09:26:45.139143 4992 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:26:45 crc kubenswrapper[4992]: E0131 09:26:45.139247 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs podName:afb1d129-e6bb-4db2-8204-3a1f4d91048e nodeName:}" failed. No retries permitted until 2026-01-31 09:27:49.139220805 +0000 UTC m=+165.110612792 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs") pod "network-metrics-daemon-bplq6" (UID: "afb1d129-e6bb-4db2-8204-3a1f4d91048e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:26:45 crc kubenswrapper[4992]: I0131 09:26:45.181889 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:45 crc kubenswrapper[4992]: I0131 09:26:45.182035 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:45 crc kubenswrapper[4992]: I0131 09:26:45.182085 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:45 crc kubenswrapper[4992]: I0131 09:26:45.182119 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:45 crc kubenswrapper[4992]: E0131 09:26:45.182860 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:45 crc kubenswrapper[4992]: E0131 09:26:45.182958 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:45 crc kubenswrapper[4992]: E0131 09:26:45.183144 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:45 crc kubenswrapper[4992]: E0131 09:26:45.183235 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:47 crc kubenswrapper[4992]: I0131 09:26:47.181721 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:47 crc kubenswrapper[4992]: I0131 09:26:47.181763 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:47 crc kubenswrapper[4992]: I0131 09:26:47.181769 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:47 crc kubenswrapper[4992]: I0131 09:26:47.181859 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:47 crc kubenswrapper[4992]: E0131 09:26:47.181855 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:47 crc kubenswrapper[4992]: E0131 09:26:47.182021 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:47 crc kubenswrapper[4992]: E0131 09:26:47.182037 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:47 crc kubenswrapper[4992]: E0131 09:26:47.182184 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:49 crc kubenswrapper[4992]: I0131 09:26:49.181796 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:49 crc kubenswrapper[4992]: I0131 09:26:49.181911 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:49 crc kubenswrapper[4992]: E0131 09:26:49.181916 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:49 crc kubenswrapper[4992]: I0131 09:26:49.181796 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:49 crc kubenswrapper[4992]: I0131 09:26:49.182078 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:49 crc kubenswrapper[4992]: E0131 09:26:49.182058 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:49 crc kubenswrapper[4992]: E0131 09:26:49.182212 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:49 crc kubenswrapper[4992]: E0131 09:26:49.182287 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:50 crc kubenswrapper[4992]: I0131 09:26:50.183283 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:26:50 crc kubenswrapper[4992]: E0131 09:26:50.183584 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-46cdx_openshift-ovn-kubernetes(6939ca32-c541-41c0-ba96-4282b942ff16)\"" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" Jan 31 09:26:51 crc kubenswrapper[4992]: I0131 09:26:51.182641 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:51 crc kubenswrapper[4992]: I0131 09:26:51.182669 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:51 crc kubenswrapper[4992]: I0131 09:26:51.182706 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:51 crc kubenswrapper[4992]: E0131 09:26:51.182764 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:51 crc kubenswrapper[4992]: I0131 09:26:51.182780 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:51 crc kubenswrapper[4992]: E0131 09:26:51.182896 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:51 crc kubenswrapper[4992]: E0131 09:26:51.182976 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:51 crc kubenswrapper[4992]: E0131 09:26:51.183041 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:53 crc kubenswrapper[4992]: I0131 09:26:53.181717 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:53 crc kubenswrapper[4992]: I0131 09:26:53.181787 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:53 crc kubenswrapper[4992]: I0131 09:26:53.181812 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:53 crc kubenswrapper[4992]: E0131 09:26:53.181882 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:53 crc kubenswrapper[4992]: E0131 09:26:53.182042 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:53 crc kubenswrapper[4992]: I0131 09:26:53.182115 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:53 crc kubenswrapper[4992]: E0131 09:26:53.182360 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:53 crc kubenswrapper[4992]: E0131 09:26:53.182585 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:55 crc kubenswrapper[4992]: I0131 09:26:55.182220 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:55 crc kubenswrapper[4992]: I0131 09:26:55.182236 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:55 crc kubenswrapper[4992]: I0131 09:26:55.182277 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:55 crc kubenswrapper[4992]: E0131 09:26:55.184276 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:55 crc kubenswrapper[4992]: I0131 09:26:55.184328 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:55 crc kubenswrapper[4992]: E0131 09:26:55.184460 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:55 crc kubenswrapper[4992]: E0131 09:26:55.184659 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:55 crc kubenswrapper[4992]: E0131 09:26:55.185217 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:57 crc kubenswrapper[4992]: I0131 09:26:57.182133 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:57 crc kubenswrapper[4992]: I0131 09:26:57.182232 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:57 crc kubenswrapper[4992]: E0131 09:26:57.182276 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:57 crc kubenswrapper[4992]: I0131 09:26:57.182306 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:57 crc kubenswrapper[4992]: I0131 09:26:57.182238 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:57 crc kubenswrapper[4992]: E0131 09:26:57.182414 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:26:57 crc kubenswrapper[4992]: E0131 09:26:57.182472 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:57 crc kubenswrapper[4992]: E0131 09:26:57.182545 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:59 crc kubenswrapper[4992]: I0131 09:26:59.181673 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:26:59 crc kubenswrapper[4992]: I0131 09:26:59.181685 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:26:59 crc kubenswrapper[4992]: E0131 09:26:59.182240 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:26:59 crc kubenswrapper[4992]: I0131 09:26:59.181805 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:26:59 crc kubenswrapper[4992]: I0131 09:26:59.181719 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:26:59 crc kubenswrapper[4992]: E0131 09:26:59.182573 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:26:59 crc kubenswrapper[4992]: E0131 09:26:59.182709 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:26:59 crc kubenswrapper[4992]: E0131 09:26:59.182821 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.181918 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.181930 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.182066 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.182198 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:01 crc kubenswrapper[4992]: E0131 09:27:01.182302 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:01 crc kubenswrapper[4992]: E0131 09:27:01.182495 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:01 crc kubenswrapper[4992]: E0131 09:27:01.182601 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:01 crc kubenswrapper[4992]: E0131 09:27:01.182674 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.918662 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/1.log" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.919192 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/0.log" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.919242 4992 generic.go:334] "Generic (PLEG): container finished" podID="6bd42532-8655-4c14-991b-4cc36dea52d5" containerID="8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30" exitCode=1 Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.919285 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerDied","Data":"8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30"} Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.919349 4992 scope.go:117] "RemoveContainer" containerID="29d3f6ef61c68631a4e543e4fbe0a690eaa3075c8d051b501456f8f101df5c57" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.919932 4992 scope.go:117] "RemoveContainer" containerID="8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30" Jan 31 09:27:01 crc kubenswrapper[4992]: E0131 
09:27:01.920080 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bjplh_openshift-multus(6bd42532-8655-4c14-991b-4cc36dea52d5)\"" pod="openshift-multus/multus-bjplh" podUID="6bd42532-8655-4c14-991b-4cc36dea52d5" Jan 31 09:27:01 crc kubenswrapper[4992]: I0131 09:27:01.939995 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pq98w" podStartSLOduration=94.939979864 podStartE2EDuration="1m34.939979864s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:26:44.874131638 +0000 UTC m=+100.845523665" watchObservedRunningTime="2026-01-31 09:27:01.939979864 +0000 UTC m=+117.911371851" Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.182117 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.924640 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/3.log" Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.927812 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerStarted","Data":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.928329 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.929436 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/1.log" Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.962204 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podStartSLOduration=95.962182139 podStartE2EDuration="1m35.962182139s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:02.960719437 +0000 UTC m=+118.932111444" watchObservedRunningTime="2026-01-31 09:27:02.962182139 +0000 UTC m=+118.933574146" Jan 31 09:27:02 crc kubenswrapper[4992]: I0131 09:27:02.999144 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bplq6"] Jan 31 09:27:03 crc kubenswrapper[4992]: I0131 09:27:02.999355 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:03 crc kubenswrapper[4992]: E0131 09:27:02.999534 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:03 crc kubenswrapper[4992]: I0131 09:27:03.182719 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:03 crc kubenswrapper[4992]: E0131 09:27:03.183100 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:03 crc kubenswrapper[4992]: I0131 09:27:03.182918 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:03 crc kubenswrapper[4992]: I0131 09:27:03.182933 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:03 crc kubenswrapper[4992]: E0131 09:27:03.183279 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:03 crc kubenswrapper[4992]: E0131 09:27:03.183343 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:05 crc kubenswrapper[4992]: E0131 09:27:05.151351 4992 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 31 09:27:05 crc kubenswrapper[4992]: I0131 09:27:05.181974 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:05 crc kubenswrapper[4992]: I0131 09:27:05.182087 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:05 crc kubenswrapper[4992]: E0131 09:27:05.187747 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:05 crc kubenswrapper[4992]: I0131 09:27:05.188136 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:05 crc kubenswrapper[4992]: I0131 09:27:05.188203 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:05 crc kubenswrapper[4992]: E0131 09:27:05.188299 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:05 crc kubenswrapper[4992]: E0131 09:27:05.188500 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:05 crc kubenswrapper[4992]: E0131 09:27:05.188653 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:05 crc kubenswrapper[4992]: E0131 09:27:05.285094 4992 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:27:07 crc kubenswrapper[4992]: I0131 09:27:07.182012 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:07 crc kubenswrapper[4992]: I0131 09:27:07.182260 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:07 crc kubenswrapper[4992]: I0131 09:27:07.182323 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:07 crc kubenswrapper[4992]: I0131 09:27:07.182317 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:07 crc kubenswrapper[4992]: E0131 09:27:07.182361 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:07 crc kubenswrapper[4992]: E0131 09:27:07.182676 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:07 crc kubenswrapper[4992]: E0131 09:27:07.182902 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:07 crc kubenswrapper[4992]: E0131 09:27:07.183299 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:09 crc kubenswrapper[4992]: I0131 09:27:09.182858 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:09 crc kubenswrapper[4992]: I0131 09:27:09.182901 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:09 crc kubenswrapper[4992]: I0131 09:27:09.182978 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:09 crc kubenswrapper[4992]: E0131 09:27:09.183041 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:09 crc kubenswrapper[4992]: I0131 09:27:09.183078 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:09 crc kubenswrapper[4992]: E0131 09:27:09.183233 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:09 crc kubenswrapper[4992]: E0131 09:27:09.183370 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:09 crc kubenswrapper[4992]: E0131 09:27:09.183502 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:10 crc kubenswrapper[4992]: E0131 09:27:10.287232 4992 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:27:11 crc kubenswrapper[4992]: I0131 09:27:11.181972 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:11 crc kubenswrapper[4992]: I0131 09:27:11.182017 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:11 crc kubenswrapper[4992]: E0131 09:27:11.182157 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:11 crc kubenswrapper[4992]: E0131 09:27:11.182259 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:11 crc kubenswrapper[4992]: I0131 09:27:11.182595 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:11 crc kubenswrapper[4992]: I0131 09:27:11.182695 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:11 crc kubenswrapper[4992]: E0131 09:27:11.182765 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:11 crc kubenswrapper[4992]: E0131 09:27:11.182869 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:13 crc kubenswrapper[4992]: I0131 09:27:13.181781 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:13 crc kubenswrapper[4992]: E0131 09:27:13.182239 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:13 crc kubenswrapper[4992]: I0131 09:27:13.181899 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:13 crc kubenswrapper[4992]: I0131 09:27:13.181856 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:13 crc kubenswrapper[4992]: E0131 09:27:13.182327 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:13 crc kubenswrapper[4992]: I0131 09:27:13.182032 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:13 crc kubenswrapper[4992]: E0131 09:27:13.182646 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:13 crc kubenswrapper[4992]: E0131 09:27:13.182730 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.181912 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.181912 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.181985 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.181924 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:15 crc kubenswrapper[4992]: E0131 09:27:15.183046 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:15 crc kubenswrapper[4992]: E0131 09:27:15.183214 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:15 crc kubenswrapper[4992]: E0131 09:27:15.183450 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:15 crc kubenswrapper[4992]: E0131 09:27:15.183544 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.183898 4992 scope.go:117] "RemoveContainer" containerID="8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30" Jan 31 09:27:15 crc kubenswrapper[4992]: E0131 09:27:15.288886 4992 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.974187 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/1.log" Jan 31 09:27:15 crc kubenswrapper[4992]: I0131 09:27:15.974244 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerStarted","Data":"6c56799f9d42ab763c18e23603d8d02dbeae5c0bc0167fb521c17f9dd9372a8a"} Jan 31 09:27:17 crc kubenswrapper[4992]: I0131 09:27:17.182014 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:17 crc kubenswrapper[4992]: I0131 09:27:17.182049 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:17 crc kubenswrapper[4992]: I0131 09:27:17.182014 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:17 crc kubenswrapper[4992]: E0131 09:27:17.182141 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:17 crc kubenswrapper[4992]: I0131 09:27:17.182172 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:17 crc kubenswrapper[4992]: E0131 09:27:17.182240 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:17 crc kubenswrapper[4992]: E0131 09:27:17.182293 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:17 crc kubenswrapper[4992]: E0131 09:27:17.182365 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:19 crc kubenswrapper[4992]: I0131 09:27:19.182559 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:19 crc kubenswrapper[4992]: I0131 09:27:19.182603 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:19 crc kubenswrapper[4992]: E0131 09:27:19.182758 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:27:19 crc kubenswrapper[4992]: E0131 09:27:19.182897 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:27:19 crc kubenswrapper[4992]: I0131 09:27:19.183180 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:19 crc kubenswrapper[4992]: E0131 09:27:19.183317 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bplq6" podUID="afb1d129-e6bb-4db2-8204-3a1f4d91048e" Jan 31 09:27:19 crc kubenswrapper[4992]: I0131 09:27:19.183624 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:19 crc kubenswrapper[4992]: E0131 09:27:19.183791 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.181854 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.182239 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.182314 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.182483 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.186350 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.186448 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.186945 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.186979 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.187250 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 09:27:21 crc kubenswrapper[4992]: I0131 09:27:21.187328 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 09:27:23 crc kubenswrapper[4992]: I0131 09:27:23.820365 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.132207 4992 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.170268 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-68zwk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.171016 4992 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.171377 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.172014 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.175481 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177228 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177371 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177440 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177269 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177488 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177581 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.177598 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.178788 4992 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.178795 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.178822 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.179066 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.179192 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.179202 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.180857 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.180964 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.181198 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.181223 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.181364 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.187889 4992 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.188843 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.189177 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vktdq"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.190009 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.190057 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.193794 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.195277 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.196276 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.197151 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.198402 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.198470 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.199227 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-srdfh"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.199510 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.199946 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.200863 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.201687 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.214639 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.229234 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.229667 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.230537 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.230710 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.230858 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.230974 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.231079 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.231543 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 
09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.231765 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.232787 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.233303 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.233505 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234010 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234058 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234115 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234191 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234238 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234370 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234655 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234945 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.234966 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.235143 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.236111 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7bjlw"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.236509 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.237994 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jgrjj"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.239053 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6nssv"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.239673 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.240263 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.240437 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.240614 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.241302 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.241381 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.243654 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2m8dn"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.243996 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.244247 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.244796 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.244909 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.245247 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.245457 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.247569 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.247857 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.248188 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrmqz"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.256264 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.273153 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.273950 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.277508 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-j6dj7"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.278616 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8vlmm"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.279154 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.279452 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.282585 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.283503 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.284403 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.284679 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glkns"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.294619 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.295475 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.334541 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.334855 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.335006 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.335549 4992 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.335732 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.336146 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.336687 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.336908 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337116 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337221 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337333 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337464 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337494 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337466 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" 
Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337600 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337648 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337704 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337735 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337817 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337882 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337923 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.337992 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338091 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338182 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338323 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" 
Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338409 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338542 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338660 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338755 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338789 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338876 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.338919 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339008 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339057 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339123 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339165 4992 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339218 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339265 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339355 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339400 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339472 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339579 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339678 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.339747 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.340203 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7kkdk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.340441 4992 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.340581 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.340722 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.340817 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.341047 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.341078 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.341172 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.341363 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.342973 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.343412 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.345435 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.345733 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-68zwk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.345753 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.346078 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.346122 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tq5sk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.346562 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.346598 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2bwrf"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.347087 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.347503 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.347984 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.348007 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-srdfh"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.348019 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.348332 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7bjlw"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.348354 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.349296 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.349591 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.349782 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.350408 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.352307 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.352786 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.352823 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.352955 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.360882 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.364373 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.396573 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.398854 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.401031 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545db117-1eda-49e5-96e5-223285792b1c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: 
\"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.401074 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-image-import-ca\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.403328 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404078 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-config\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404141 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404169 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2g5j\" (UniqueName: \"kubernetes.io/projected/3c397c46-5579-414a-aca9-3822b9e603ea-kube-api-access-m2g5j\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404192 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-serving-cert\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404211 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c39f4282-81b2-41a4-8283-5851f4005972-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404236 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd0c1366-dfbf-487e-98a4-94fb4be75045-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404334 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404398 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2pnl\" (UniqueName: \"kubernetes.io/projected/6e0e4407-bfda-4d16-9e9e-d9065286a07d-kube-api-access-s2pnl\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404493 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd0c1366-dfbf-487e-98a4-94fb4be75045-serving-cert\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404517 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-etcd-serving-ca\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404563 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c397c46-5579-414a-aca9-3822b9e603ea-serving-cert\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404587 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/545db117-1eda-49e5-96e5-223285792b1c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: 
\"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404651 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e0e4407-bfda-4d16-9e9e-d9065286a07d-node-pullsecrets\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404686 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/545db117-1eda-49e5-96e5-223285792b1c-config\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404713 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-config\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404736 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-audit\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404788 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0e4407-bfda-4d16-9e9e-d9065286a07d-audit-dir\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404821 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-etcd-client\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404843 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c39f4282-81b2-41a4-8283-5851f4005972-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404873 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4np6f\" (UniqueName: \"kubernetes.io/projected/c39f4282-81b2-41a4-8283-5851f4005972-kube-api-access-4np6f\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404904 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz2g9\" (UniqueName: \"kubernetes.io/projected/fd0c1366-dfbf-487e-98a4-94fb4be75045-kube-api-access-qz2g9\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404925 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.404978 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-config\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.405006 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-encryption-config\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.405058 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c39f4282-81b2-41a4-8283-5851f4005972-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.405096 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-client-ca\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.406290 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.416597 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.420514 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.420557 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.420714 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.423127 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.423280 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.424764 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.424786 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.424797 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vktdq"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.424805 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.424815 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r68rm"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425206 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425225 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6nssv"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425235 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425244 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-j6dj7"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425286 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425324 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425480 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.425965 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.426459 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glkns"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.426538 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.426557 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v7m78"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.426579 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.427206 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.427240 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jgrjj"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.427292 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrmqz"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.427312 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9hqbq"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.427549 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.428153 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tq5sk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.428230 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.428550 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r68rm"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.429929 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.433335 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.435276 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.439216 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2bwrf"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.444717 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.444764 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2m8dn"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.444777 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.446699 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.448118 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.449222 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.454484 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.454583 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.454597 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.456607 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7kkdk"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.457048 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.458356 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.459320 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.461115 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cb5lw"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.462300 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.463707 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.465318 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-np7kv"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.465989 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.468325 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-np7kv"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.470854 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cb5lw"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.472378 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v7m78"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.472803 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.478918 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw"] Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.493547 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.505978 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0e4407-bfda-4d16-9e9e-d9065286a07d-audit-dir\") pod \"apiserver-76f77b778f-68zwk\" (UID: 
\"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506031 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-etcd-client\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506053 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c39f4282-81b2-41a4-8283-5851f4005972-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506055 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e0e4407-bfda-4d16-9e9e-d9065286a07d-audit-dir\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506069 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4np6f\" (UniqueName: \"kubernetes.io/projected/c39f4282-81b2-41a4-8283-5851f4005972-kube-api-access-4np6f\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506143 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz2g9\" (UniqueName: 
\"kubernetes.io/projected/fd0c1366-dfbf-487e-98a4-94fb4be75045-kube-api-access-qz2g9\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506171 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506193 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-config\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506211 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-encryption-config\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506230 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c39f4282-81b2-41a4-8283-5851f4005972-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc 
kubenswrapper[4992]: I0131 09:27:24.506267 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-client-ca\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506331 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545db117-1eda-49e5-96e5-223285792b1c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506359 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-image-import-ca\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506381 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-config\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506406 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-68zwk\" (UID: 
\"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506549 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2g5j\" (UniqueName: \"kubernetes.io/projected/3c397c46-5579-414a-aca9-3822b9e603ea-kube-api-access-m2g5j\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506579 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-serving-cert\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506601 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c39f4282-81b2-41a4-8283-5851f4005972-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506627 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd0c1366-dfbf-487e-98a4-94fb4be75045-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506655 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506687 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd0c1366-dfbf-487e-98a4-94fb4be75045-serving-cert\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506730 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-etcd-serving-ca\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506754 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2pnl\" (UniqueName: \"kubernetes.io/projected/6e0e4407-bfda-4d16-9e9e-d9065286a07d-kube-api-access-s2pnl\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506773 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c397c46-5579-414a-aca9-3822b9e603ea-serving-cert\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc 
kubenswrapper[4992]: I0131 09:27:24.506796 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/545db117-1eda-49e5-96e5-223285792b1c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506815 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e0e4407-bfda-4d16-9e9e-d9065286a07d-node-pullsecrets\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506837 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/545db117-1eda-49e5-96e5-223285792b1c-config\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506854 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-config\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.506873 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-audit\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.507400 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd0c1366-dfbf-487e-98a4-94fb4be75045-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.507449 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e0e4407-bfda-4d16-9e9e-d9065286a07d-node-pullsecrets\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.507510 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c39f4282-81b2-41a4-8283-5851f4005972-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.507608 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-audit\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.507989 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-config\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.508295 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-client-ca\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.508844 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-image-import-ca\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.509028 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.509720 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e0e4407-bfda-4d16-9e9e-d9065286a07d-etcd-serving-ca\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.510115 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-config\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.511432 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-config\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.511782 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-etcd-client\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.511846 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-serving-cert\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.512541 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd0c1366-dfbf-487e-98a4-94fb4be75045-serving-cert\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.512827 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.512848 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c39f4282-81b2-41a4-8283-5851f4005972-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.514648 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c397c46-5579-414a-aca9-3822b9e603ea-serving-cert\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.515377 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e0e4407-bfda-4d16-9e9e-d9065286a07d-encryption-config\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.520931 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.534010 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.553023 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 
09:27:24.572833 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.593215 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.612784 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.632547 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.653463 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.691454 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.693224 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.733349 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.753198 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.773162 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.780964 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/545db117-1eda-49e5-96e5-223285792b1c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.793901 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.799046 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/545db117-1eda-49e5-96e5-223285792b1c-config\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.814111 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.855640 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.873694 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.893561 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.913244 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.933687 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 09:27:24 crc 
kubenswrapper[4992]: I0131 09:27:24.953763 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.974155 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 09:27:24 crc kubenswrapper[4992]: I0131 09:27:24.993481 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.013888 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.033666 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.053033 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.073701 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.093264 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.113282 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.134221 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 09:27:25 crc 
kubenswrapper[4992]: I0131 09:27:25.153327 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.175076 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.193157 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.213758 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.232648 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.253833 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.272822 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.293245 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.314305 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.333862 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.351315 4992 
request.go:700] Waited for 1.000928257s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.353024 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.373094 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.392810 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.412987 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.433754 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.453462 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.473266 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.493781 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.513479 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 
09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.533782 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.553489 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.573780 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.593763 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.616531 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.633908 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.654783 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.674175 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.693649 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.713133 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.733822 4992 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.754075 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.801220 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.801413 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.821620 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.833526 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.853664 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.873836 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.893541 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.915106 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.934117 4992 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.953616 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.974528 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 09:27:25 crc kubenswrapper[4992]: I0131 09:27:25.994319 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.014060 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.033439 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.054011 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.076440 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.093485 4992 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.113101 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.133892 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.153661 4992 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.175011 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.194467 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.214459 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.263612 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz2g9\" (UniqueName: \"kubernetes.io/projected/fd0c1366-dfbf-487e-98a4-94fb4be75045-kube-api-access-qz2g9\") pod \"openshift-config-operator-7777fb866f-z2gbk\" (UID: \"fd0c1366-dfbf-487e-98a4-94fb4be75045\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.284064 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4np6f\" (UniqueName: \"kubernetes.io/projected/c39f4282-81b2-41a4-8283-5851f4005972-kube-api-access-4np6f\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.288747 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2g5j\" (UniqueName: \"kubernetes.io/projected/3c397c46-5579-414a-aca9-3822b9e603ea-kube-api-access-m2g5j\") pod \"route-controller-manager-6576b87f9c-bcggv\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:26 crc 
kubenswrapper[4992]: I0131 09:27:26.320277 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545db117-1eda-49e5-96e5-223285792b1c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f7qlp\" (UID: \"545db117-1eda-49e5-96e5-223285792b1c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.330413 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2pnl\" (UniqueName: \"kubernetes.io/projected/6e0e4407-bfda-4d16-9e9e-d9065286a07d-kube-api-access-s2pnl\") pod \"apiserver-76f77b778f-68zwk\" (UID: \"6e0e4407-bfda-4d16-9e9e-d9065286a07d\") " pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.351971 4992 request.go:700] Waited for 1.842741152s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.352836 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lmbqt\" (UID: \"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.362798 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.367634 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c39f4282-81b2-41a4-8283-5851f4005972-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zcs9d\" (UID: \"c39f4282-81b2-41a4-8283-5851f4005972\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.411034 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.435842 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.435886 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-serving-cert\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.435909 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.435935 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.435955 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-trusted-ca\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436016 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047a41f9-3608-40a0-a1a2-ccdde5061412-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436103 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-service-ca\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436160 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-oauth-serving-cert\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436197 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzrf5\" (UniqueName: \"kubernetes.io/projected/1aa0742f-dd1e-46bf-ba75-5368a621cb89-kube-api-access-nzrf5\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436222 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bps8z\" (UniqueName: \"kubernetes.io/projected/047a41f9-3608-40a0-a1a2-ccdde5061412-kube-api-access-bps8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436290 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e86d38f0-15ae-4043-a550-54cae8cf4e8d-auth-proxy-config\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436329 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436355 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-audit-dir\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436378 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa0742f-dd1e-46bf-ba75-5368a621cb89-serving-cert\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436398 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-config\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436437 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvk94\" (UniqueName: \"kubernetes.io/projected/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-kube-api-access-tvk94\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: 
I0131 09:27:26.436478 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-bound-sa-token\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436506 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86d38f0-15ae-4043-a550-54cae8cf4e8d-config\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436543 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436589 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/754c2a0a-7622-4316-9706-e8499dd756a5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.436662 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 09:27:26.936647268 +0000 UTC m=+142.908039255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436689 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltqsj\" (UniqueName: \"kubernetes.io/projected/eb9a2d0a-5c18-44d4-aa62-922d1937a7a4-kube-api-access-ltqsj\") pod \"cluster-samples-operator-665b6dd947-kmv44\" (UID: \"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436717 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7425a945-4499-4a87-b745-d31e5dbf9d0e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436748 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8npb\" (UniqueName: \"kubernetes.io/projected/b7269408-3cf0-468a-a5d4-2625ff71b408-kube-api-access-n8npb\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.436850 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52dff\" (UniqueName: \"kubernetes.io/projected/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-kube-api-access-52dff\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436872 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-config\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436895 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-audit-policies\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436915 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-default-certificate\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436935 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-trusted-ca-bundle\") pod \"console-f9d7485db-7bjlw\" (UID: 
\"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436959 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.436994 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437013 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-serving-cert\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437033 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e86d38f0-15ae-4043-a550-54cae8cf4e8d-machine-approver-tls\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437058 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-trusted-ca\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437077 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e3fff7-b9d0-4107-96ae-a0a00965f574-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437096 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-serving-cert\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437126 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd4qw\" (UniqueName: \"kubernetes.io/projected/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-kube-api-access-cd4qw\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437146 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7269408-3cf0-468a-a5d4-2625ff71b408-trusted-ca\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437165 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7269408-3cf0-468a-a5d4-2625ff71b408-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437197 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437248 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-registry-tls\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437275 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-metrics-certs\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437298 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437331 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7425a945-4499-4a87-b745-d31e5dbf9d0e-config\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437356 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxjmn\" (UniqueName: \"kubernetes.io/projected/7425a945-4499-4a87-b745-d31e5dbf9d0e-kube-api-access-rxjmn\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437376 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-client-ca\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437437 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb9a2d0a-5c18-44d4-aa62-922d1937a7a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kmv44\" (UID: 
\"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437467 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-stats-auth\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437488 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-service-ca\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437515 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-oauth-config\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437529 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437544 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437558 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7269408-3cf0-468a-a5d4-2625ff71b408-metrics-tls\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437606 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-ca\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437627 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-policies\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437652 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-dir\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: 
I0131 09:27:26.437721 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-registry-certificates\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437753 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m47f4\" (UniqueName: \"kubernetes.io/projected/dd243542-ca16-4b95-9fa1-b579ee3cca2e-kube-api-access-m47f4\") pod \"downloads-7954f5f757-jgrjj\" (UID: \"dd243542-ca16-4b95-9fa1-b579ee3cca2e\") " pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.437812 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439361 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439389 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xbkh\" (UniqueName: 
\"kubernetes.io/projected/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-kube-api-access-6xbkh\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439441 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w459n\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-kube-api-access-w459n\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439462 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439485 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-serving-cert\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439555 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58e3fff7-b9d0-4107-96ae-a0a00965f574-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439582 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvtz\" (UniqueName: \"kubernetes.io/projected/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-kube-api-access-8hvtz\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439602 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-client\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439623 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439643 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-etcd-client\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439668 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439729 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-config\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439749 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/047a41f9-3608-40a0-a1a2-ccdde5061412-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439767 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-config\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439788 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: 
\"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439806 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-serving-cert\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439828 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/754c2a0a-7622-4316-9706-e8499dd756a5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439851 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9sg\" (UniqueName: \"kubernetes.io/projected/e86d38f0-15ae-4043-a550-54cae8cf4e8d-kube-api-access-rz9sg\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439878 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-service-ca-bundle\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439916 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fxvn\" (UniqueName: \"kubernetes.io/projected/58e3fff7-b9d0-4107-96ae-a0a00965f574-kube-api-access-5fxvn\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.439934 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-config\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.440017 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsnbx\" (UniqueName: \"kubernetes.io/projected/db3860c3-37de-4fa5-9c79-965abd0e2149-kube-api-access-rsnbx\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.440074 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgv4k\" (UniqueName: \"kubernetes.io/projected/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-kube-api-access-qgv4k\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.440185 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7425a945-4499-4a87-b745-d31e5dbf9d0e-images\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: 
\"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.440224 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-encryption-config\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.505149 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541555 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541714 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-signing-key\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541738 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br8xx\" (UniqueName: \"kubernetes.io/projected/3f216997-1de4-499d-b5f2-0bacbbdbdd36-kube-api-access-br8xx\") pod \"migrator-59844c95c7-5564v\" (UID: \"3f216997-1de4-499d-b5f2-0bacbbdbdd36\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" Jan 
31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541766 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-registry-certificates\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541785 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpf6w\" (UniqueName: \"kubernetes.io/projected/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-kube-api-access-qpf6w\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541811 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m47f4\" (UniqueName: \"kubernetes.io/projected/dd243542-ca16-4b95-9fa1-b579ee3cca2e-kube-api-access-m47f4\") pod \"downloads-7954f5f757-jgrjj\" (UID: \"dd243542-ca16-4b95-9fa1-b579ee3cca2e\") " pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541826 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541841 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541861 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-serving-cert\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541878 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541896 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhw87\" (UniqueName: \"kubernetes.io/projected/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-kube-api-access-nhw87\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541926 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8fe86e-1b21-4358-aa22-4c0939d313f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.541944 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-client\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541960 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541977 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4ab691e-904d-49b1-9b3e-57a8271bd791-images\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.541999 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542014 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82abaea9-9d21-432e-a434-d21fe6a7197b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: 
\"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542039 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-config\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542054 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/047a41f9-3608-40a0-a1a2-ccdde5061412-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542070 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkjg\" (UniqueName: \"kubernetes.io/projected/a364be49-d097-438f-858d-77e2bcff5ad0-kube-api-access-thkjg\") pod \"multus-admission-controller-857f4d67dd-7kkdk\" (UID: \"a364be49-d097-438f-858d-77e2bcff5ad0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542089 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9sg\" (UniqueName: \"kubernetes.io/projected/e86d38f0-15ae-4043-a550-54cae8cf4e8d-kube-api-access-rz9sg\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542106 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f8fe86e-1b21-4358-aa22-4c0939d313f7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542133 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/754c2a0a-7622-4316-9706-e8499dd756a5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542150 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-service-ca-bundle\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542174 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-config\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542190 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82abaea9-9d21-432e-a434-d21fe6a7197b-srv-cert\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542207 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-plugins-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542230 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7425a945-4499-4a87-b745-d31e5dbf9d0e-images\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542258 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-socket-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542275 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-registration-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542295 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542312 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4ab691e-904d-49b1-9b3e-57a8271bd791-proxy-tls\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542329 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-node-bootstrap-token\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542346 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c59658-5ed8-4cef-b36d-2a1e44ec6976-config-volume\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542361 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-csi-data-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 
09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542377 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047a41f9-3608-40a0-a1a2-ccdde5061412-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542393 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-certs\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542412 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-audit-dir\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542454 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa0742f-dd1e-46bf-ba75-5368a621cb89-serving-cert\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542469 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1993d67-8009-4a0c-90b2-517e1504bc2a-config-volume\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " 
pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542486 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86d38f0-15ae-4043-a550-54cae8cf4e8d-config\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542503 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/754c2a0a-7622-4316-9706-e8499dd756a5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542522 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltqsj\" (UniqueName: \"kubernetes.io/projected/eb9a2d0a-5c18-44d4-aa62-922d1937a7a4-kube-api-access-ltqsj\") pod \"cluster-samples-operator-665b6dd947-kmv44\" (UID: \"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542539 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7425a945-4499-4a87-b745-d31e5dbf9d0e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542556 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52dff\" (UniqueName: 
\"kubernetes.io/projected/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-kube-api-access-52dff\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542572 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-audit-policies\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542592 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qql4s\" (UniqueName: \"kubernetes.io/projected/46b9e525-ef69-4600-92c7-8eb418824669-kube-api-access-qql4s\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542608 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-trusted-ca-bundle\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542623 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.542639 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-serving-cert\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542672 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e86d38f0-15ae-4043-a550-54cae8cf4e8d-machine-approver-tls\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542706 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46b9e525-ef69-4600-92c7-8eb418824669-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542737 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-default-certificate\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542759 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4ab691e-904d-49b1-9b3e-57a8271bd791-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542776 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-mountpoint-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542794 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpc5j\" (UniqueName: \"kubernetes.io/projected/c1993d67-8009-4a0c-90b2-517e1504bc2a-kube-api-access-hpc5j\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542810 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-serving-cert\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542826 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e3fff7-b9d0-4107-96ae-a0a00965f574-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542844 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" 
(UniqueName: \"kubernetes.io/configmap/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-signing-cabundle\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542860 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65c59658-5ed8-4cef-b36d-2a1e44ec6976-secret-volume\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542922 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542942 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-registry-tls\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542959 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-metrics-certs\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542974 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.542995 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7425a945-4499-4a87-b745-d31e5dbf9d0e-config\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543012 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543026 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7269408-3cf0-468a-a5d4-2625ff71b408-metrics-tls\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543045 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9398e230-5d76-4418-9807-be17513913c0-profile-collector-cert\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543105 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb9a2d0a-5c18-44d4-aa62-922d1937a7a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kmv44\" (UID: \"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543152 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-oauth-config\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543179 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-dir\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543206 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-ca\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543226 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-policies\") pod 
\"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543250 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xbkh\" (UniqueName: \"kubernetes.io/projected/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-kube-api-access-6xbkh\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543274 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543304 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w459n\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-kube-api-access-w459n\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543337 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58e3fff7-b9d0-4107-96ae-a0a00965f574-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543359 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8hvtz\" (UniqueName: \"kubernetes.io/projected/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-kube-api-access-8hvtz\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543380 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-etcd-client\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543406 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbfb9fcb-64ca-4640-a230-4212004d2494-apiservice-cert\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543454 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qh79\" (UniqueName: \"kubernetes.io/projected/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-kube-api-access-2qh79\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543486 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24b61f27-cb4d-4611-a0e8-14f618385f83-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543510 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7edcde3-ebb8-4a50-a75f-28539482b78b-cert\") pod \"ingress-canary-np7kv\" (UID: \"e7edcde3-ebb8-4a50-a75f-28539482b78b\") " pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543552 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-config\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543573 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543593 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-serving-cert\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543618 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkldh\" (UniqueName: 
\"kubernetes.io/projected/0a88cb09-555e-4fd5-9ffc-fcff02f2bf35-kube-api-access-nkldh\") pod \"dns-operator-744455d44c-tq5sk\" (UID: \"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35\") " pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543641 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fbfb9fcb-64ca-4640-a230-4212004d2494-tmpfs\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543667 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543689 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46b9e525-ef69-4600-92c7-8eb418824669-proxy-tls\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543723 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fxvn\" (UniqueName: \"kubernetes.io/projected/58e3fff7-b9d0-4107-96ae-a0a00965f574-kube-api-access-5fxvn\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543744 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsnbx\" (UniqueName: \"kubernetes.io/projected/db3860c3-37de-4fa5-9c79-965abd0e2149-kube-api-access-rsnbx\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543778 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgv4k\" (UniqueName: \"kubernetes.io/projected/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-kube-api-access-qgv4k\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543806 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9398e230-5d76-4418-9807-be17513913c0-srv-cert\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543829 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnx7p\" (UniqueName: \"kubernetes.io/projected/90220a85-3f5b-4360-9cc9-d9c9a65db928-kube-api-access-tnx7p\") pod \"package-server-manager-789f6589d5-b99x8\" (UID: \"90220a85-3f5b-4360-9cc9-d9c9a65db928\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.543913 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-encryption-config\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544157 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7425a945-4499-4a87-b745-d31e5dbf9d0e-images\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.544251 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.044228081 +0000 UTC m=+143.015620068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544617 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbdk\" (UniqueName: \"kubernetes.io/projected/9fb8fd57-5826-40cd-b62d-2a53e9e0c72c-kube-api-access-jwbdk\") pod \"control-plane-machine-set-operator-78cbb6b69f-h8jxh\" (UID: \"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544678 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j45l\" (UniqueName: \"kubernetes.io/projected/9398e230-5d76-4418-9807-be17513913c0-kube-api-access-4j45l\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544724 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544779 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l5z97\" (UniqueName: \"kubernetes.io/projected/24b61f27-cb4d-4611-a0e8-14f618385f83-kube-api-access-l5z97\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544846 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe86e-1b21-4358-aa22-4c0939d313f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544893 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-serving-cert\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544933 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.544983 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-trusted-ca\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") 
" pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545035 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a924bba-57c2-4c3b-9560-ab10bed041cf-serving-cert\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545028 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/754c2a0a-7622-4316-9706-e8499dd756a5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545090 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-service-ca\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545150 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-oauth-serving-cert\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545188 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-config\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: 
\"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545205 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e86d38f0-15ae-4043-a550-54cae8cf4e8d-auth-proxy-config\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545261 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzrf5\" (UniqueName: \"kubernetes.io/projected/1aa0742f-dd1e-46bf-ba75-5368a621cb89-kube-api-access-nzrf5\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545308 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bps8z\" (UniqueName: \"kubernetes.io/projected/047a41f9-3608-40a0-a1a2-ccdde5061412-kube-api-access-bps8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545362 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545414 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrbst\" (UniqueName: \"kubernetes.io/projected/f4ab691e-904d-49b1-9b3e-57a8271bd791-kube-api-access-hrbst\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545478 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/90220a85-3f5b-4360-9cc9-d9c9a65db928-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b99x8\" (UID: \"90220a85-3f5b-4360-9cc9-d9c9a65db928\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545534 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvk94\" (UniqueName: \"kubernetes.io/projected/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-kube-api-access-tvk94\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545581 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbfb9fcb-64ca-4640-a230-4212004d2494-webhook-cert\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545616 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhnsb\" (UniqueName: 
\"kubernetes.io/projected/65c59658-5ed8-4cef-b36d-2a1e44ec6976-kube-api-access-rhnsb\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545657 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545676 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-config\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545711 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-bound-sa-token\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545744 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545778 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24b61f27-cb4d-4611-a0e8-14f618385f83-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545845 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8npb\" (UniqueName: \"kubernetes.io/projected/b7269408-3cf0-468a-a5d4-2625ff71b408-kube-api-access-n8npb\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545902 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2f6\" (UniqueName: \"kubernetes.io/projected/82abaea9-9d21-432e-a434-d21fe6a7197b-kube-api-access-8g2f6\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545956 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-config\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.545992 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9fb8fd57-5826-40cd-b62d-2a53e9e0c72c-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-h8jxh\" (UID: \"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546012 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047a41f9-3608-40a0-a1a2-ccdde5061412-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546030 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a88cb09-555e-4fd5-9ffc-fcff02f2bf35-metrics-tls\") pod \"dns-operator-744455d44c-tq5sk\" (UID: \"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35\") " pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546076 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546113 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6mtl\" (UniqueName: \"kubernetes.io/projected/fbfb9fcb-64ca-4640-a230-4212004d2494-kube-api-access-n6mtl\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546144 
4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1993d67-8009-4a0c-90b2-517e1504bc2a-metrics-tls\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546177 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a924bba-57c2-4c3b-9560-ab10bed041cf-config\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546215 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-trusted-ca\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546274 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd4qw\" (UniqueName: \"kubernetes.io/projected/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-kube-api-access-cd4qw\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546329 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7269408-3cf0-468a-a5d4-2625ff71b408-trusted-ca\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: 
I0131 09:27:26.546388 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7269408-3cf0-468a-a5d4-2625ff71b408-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546542 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-service-ca-bundle\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546728 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-config\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546749 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a364be49-d097-438f-858d-77e2bcff5ad0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7kkdk\" (UID: \"a364be49-d097-438f-858d-77e2bcff5ad0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546794 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tksxk\" (UniqueName: \"kubernetes.io/projected/3a924bba-57c2-4c3b-9560-ab10bed041cf-kube-api-access-tksxk\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.546833 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwx8\" (UniqueName: \"kubernetes.io/projected/e7edcde3-ebb8-4a50-a75f-28539482b78b-kube-api-access-qdwx8\") pod \"ingress-canary-np7kv\" (UID: \"e7edcde3-ebb8-4a50-a75f-28539482b78b\") " pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547146 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-client-ca\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547203 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxjmn\" (UniqueName: \"kubernetes.io/projected/7425a945-4499-4a87-b745-d31e5dbf9d0e-kube-api-access-rxjmn\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547236 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-service-ca\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547270 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547305 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrrlf\" (UniqueName: \"kubernetes.io/projected/24d2a857-eb20-4eb7-acb2-077e53af8b03-kube-api-access-qrrlf\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547367 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-stats-auth\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.547767 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-config\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.548115 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.549116 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e86d38f0-15ae-4043-a550-54cae8cf4e8d-auth-proxy-config\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.549751 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.049728123 +0000 UTC m=+143.021120130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.549816 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e3fff7-b9d0-4107-96ae-a0a00965f574-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.549860 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc 
kubenswrapper[4992]: I0131 09:27:26.549852 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-policies\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.549908 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-audit-dir\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.550371 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-registry-certificates\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.551112 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-trusted-ca\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.552118 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-service-ca\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.552208 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86d38f0-15ae-4043-a550-54cae8cf4e8d-config\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.552252 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-config\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.552465 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-ca\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.552516 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-dir\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.552541 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.552989 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.553601 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-service-ca\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.553950 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.554066 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1aa0742f-dd1e-46bf-ba75-5368a621cb89-config\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.554509 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc 
kubenswrapper[4992]: I0131 09:27:26.555616 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-serving-cert\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.555936 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-trusted-ca-bundle\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.557472 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7425a945-4499-4a87-b745-d31e5dbf9d0e-config\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.558455 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-oauth-serving-cert\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.559389 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1aa0742f-dd1e-46bf-ba75-5368a621cb89-serving-cert\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.560301 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-etcd-client\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.560916 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-client-ca\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.561880 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.567114 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-trusted-ca\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.567220 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-oauth-config\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.567654 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-audit-policies\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.567837 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.568227 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.569231 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.571083 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.573828 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.575709 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e86d38f0-15ae-4043-a550-54cae8cf4e8d-machine-approver-tls\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.575909 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-serving-cert\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.575991 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1aa0742f-dd1e-46bf-ba75-5368a621cb89-etcd-client\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.576125 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.576394 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7269408-3cf0-468a-a5d4-2625ff71b408-trusted-ca\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.576440 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.576402 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-stats-auth\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.576575 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb9a2d0a-5c18-44d4-aa62-922d1937a7a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kmv44\" (UID: \"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.576894 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/754c2a0a-7622-4316-9706-e8499dd756a5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.577175 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7425a945-4499-4a87-b745-d31e5dbf9d0e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.577306 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/047a41f9-3608-40a0-a1a2-ccdde5061412-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.579371 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7269408-3cf0-468a-a5d4-2625ff71b408-metrics-tls\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.580696 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-registry-tls\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.582501 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.582561 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58e3fff7-b9d0-4107-96ae-a0a00965f574-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.583473 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-metrics-certs\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.583832 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.585483 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-default-certificate\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.585766 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.590656 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-serving-cert\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.593567 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.596807 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-encryption-config\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.602397 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-serving-cert\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.603616 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rz9sg\" (UniqueName: \"kubernetes.io/projected/e86d38f0-15ae-4043-a550-54cae8cf4e8d-kube-api-access-rz9sg\") pod \"machine-approver-56656f9798-zvnpr\" (UID: \"e86d38f0-15ae-4043-a550-54cae8cf4e8d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.632099 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xbkh\" (UniqueName: \"kubernetes.io/projected/7d272437-fe00-4ec0-ba5b-35e25d59ccf2-kube-api-access-6xbkh\") pod \"apiserver-7bbb656c7d-2fk8c\" (UID: \"7d272437-fe00-4ec0-ba5b-35e25d59ccf2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.638340 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648354 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648614 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrrlf\" (UniqueName: \"kubernetes.io/projected/24d2a857-eb20-4eb7-acb2-077e53af8b03-kube-api-access-qrrlf\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648655 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpf6w\" (UniqueName: 
\"kubernetes.io/projected/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-kube-api-access-qpf6w\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648682 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-signing-key\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648704 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br8xx\" (UniqueName: \"kubernetes.io/projected/3f216997-1de4-499d-b5f2-0bacbbdbdd36-kube-api-access-br8xx\") pod \"migrator-59844c95c7-5564v\" (UID: \"3f216997-1de4-499d-b5f2-0bacbbdbdd36\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648735 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648758 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhw87\" (UniqueName: \"kubernetes.io/projected/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-kube-api-access-nhw87\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648782 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8fe86e-1b21-4358-aa22-4c0939d313f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648804 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82abaea9-9d21-432e-a434-d21fe6a7197b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648824 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4ab691e-904d-49b1-9b3e-57a8271bd791-images\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648848 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkjg\" (UniqueName: \"kubernetes.io/projected/a364be49-d097-438f-858d-77e2bcff5ad0-kube-api-access-thkjg\") pod \"multus-admission-controller-857f4d67dd-7kkdk\" (UID: \"a364be49-d097-438f-858d-77e2bcff5ad0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648869 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f8fe86e-1b21-4358-aa22-4c0939d313f7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: 
\"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648891 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82abaea9-9d21-432e-a434-d21fe6a7197b-srv-cert\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648919 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-plugins-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648940 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-socket-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648963 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-registration-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.648986 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4ab691e-904d-49b1-9b3e-57a8271bd791-proxy-tls\") pod \"machine-config-operator-74547568cd-8r6k8\" 
(UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649008 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-node-bootstrap-token\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649029 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c59658-5ed8-4cef-b36d-2a1e44ec6976-config-volume\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649049 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-csi-data-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649071 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-certs\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649095 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c1993d67-8009-4a0c-90b2-517e1504bc2a-config-volume\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649129 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qql4s\" (UniqueName: \"kubernetes.io/projected/46b9e525-ef69-4600-92c7-8eb418824669-kube-api-access-qql4s\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649167 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46b9e525-ef69-4600-92c7-8eb418824669-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649188 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpc5j\" (UniqueName: \"kubernetes.io/projected/c1993d67-8009-4a0c-90b2-517e1504bc2a-kube-api-access-hpc5j\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649208 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4ab691e-904d-49b1-9b3e-57a8271bd791-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649227 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-mountpoint-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649251 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-signing-cabundle\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649270 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65c59658-5ed8-4cef-b36d-2a1e44ec6976-secret-volume\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649297 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9398e230-5d76-4418-9807-be17513913c0-profile-collector-cert\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649344 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbfb9fcb-64ca-4640-a230-4212004d2494-apiservice-cert\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" 
Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649367 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qh79\" (UniqueName: \"kubernetes.io/projected/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-kube-api-access-2qh79\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649396 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24b61f27-cb4d-4611-a0e8-14f618385f83-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649587 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7edcde3-ebb8-4a50-a75f-28539482b78b-cert\") pod \"ingress-canary-np7kv\" (UID: \"e7edcde3-ebb8-4a50-a75f-28539482b78b\") " pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649620 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkldh\" (UniqueName: \"kubernetes.io/projected/0a88cb09-555e-4fd5-9ffc-fcff02f2bf35-kube-api-access-nkldh\") pod \"dns-operator-744455d44c-tq5sk\" (UID: \"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35\") " pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649642 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fbfb9fcb-64ca-4640-a230-4212004d2494-tmpfs\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649673 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649695 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46b9e525-ef69-4600-92c7-8eb418824669-proxy-tls\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649723 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9398e230-5d76-4418-9807-be17513913c0-srv-cert\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649756 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnx7p\" (UniqueName: \"kubernetes.io/projected/90220a85-3f5b-4360-9cc9-d9c9a65db928-kube-api-access-tnx7p\") pod \"package-server-manager-789f6589d5-b99x8\" (UID: \"90220a85-3f5b-4360-9cc9-d9c9a65db928\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649775 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgv4k\" (UniqueName: 
\"kubernetes.io/projected/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-kube-api-access-qgv4k\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649784 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j45l\" (UniqueName: \"kubernetes.io/projected/9398e230-5d76-4418-9807-be17513913c0-kube-api-access-4j45l\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649859 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbdk\" (UniqueName: \"kubernetes.io/projected/9fb8fd57-5826-40cd-b62d-2a53e9e0c72c-kube-api-access-jwbdk\") pod \"control-plane-machine-set-operator-78cbb6b69f-h8jxh\" (UID: \"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649883 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5z97\" (UniqueName: \"kubernetes.io/projected/24b61f27-cb4d-4611-a0e8-14f618385f83-kube-api-access-l5z97\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649901 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe86e-1b21-4358-aa22-4c0939d313f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649920 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a924bba-57c2-4c3b-9560-ab10bed041cf-serving-cert\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.649957 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/90220a85-3f5b-4360-9cc9-d9c9a65db928-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b99x8\" (UID: \"90220a85-3f5b-4360-9cc9-d9c9a65db928\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.649991 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.149972111 +0000 UTC m=+143.121364168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650020 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650051 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrbst\" (UniqueName: \"kubernetes.io/projected/f4ab691e-904d-49b1-9b3e-57a8271bd791-kube-api-access-hrbst\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650083 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbfb9fcb-64ca-4640-a230-4212004d2494-webhook-cert\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650108 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhnsb\" (UniqueName: 
\"kubernetes.io/projected/65c59658-5ed8-4cef-b36d-2a1e44ec6976-kube-api-access-rhnsb\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650138 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24b61f27-cb4d-4611-a0e8-14f618385f83-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650159 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2f6\" (UniqueName: \"kubernetes.io/projected/82abaea9-9d21-432e-a434-d21fe6a7197b-kube-api-access-8g2f6\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650191 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9fb8fd57-5826-40cd-b62d-2a53e9e0c72c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h8jxh\" (UID: \"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650213 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a88cb09-555e-4fd5-9ffc-fcff02f2bf35-metrics-tls\") pod \"dns-operator-744455d44c-tq5sk\" (UID: \"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650236 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6mtl\" (UniqueName: \"kubernetes.io/projected/fbfb9fcb-64ca-4640-a230-4212004d2494-kube-api-access-n6mtl\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650258 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1993d67-8009-4a0c-90b2-517e1504bc2a-metrics-tls\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650278 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a924bba-57c2-4c3b-9560-ab10bed041cf-config\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650321 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tksxk\" (UniqueName: \"kubernetes.io/projected/3a924bba-57c2-4c3b-9560-ab10bed041cf-kube-api-access-tksxk\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650350 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a364be49-d097-438f-858d-77e2bcff5ad0-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-7kkdk\" (UID: \"a364be49-d097-438f-858d-77e2bcff5ad0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.650374 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwx8\" (UniqueName: \"kubernetes.io/projected/e7edcde3-ebb8-4a50-a75f-28539482b78b-kube-api-access-qdwx8\") pod \"ingress-canary-np7kv\" (UID: \"e7edcde3-ebb8-4a50-a75f-28539482b78b\") " pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.651257 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-registration-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.651744 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8fe86e-1b21-4358-aa22-4c0939d313f7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652039 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1993d67-8009-4a0c-90b2-517e1504bc2a-config-volume\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652577 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f4ab691e-904d-49b1-9b3e-57a8271bd791-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652614 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652692 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-plugins-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652748 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-socket-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652910 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/46b9e525-ef69-4600-92c7-8eb418824669-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.652977 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-mountpoint-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.653362 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fbfb9fcb-64ca-4640-a230-4212004d2494-tmpfs\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.653608 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24d2a857-eb20-4eb7-acb2-077e53af8b03-csi-data-dir\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.653636 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c59658-5ed8-4cef-b36d-2a1e44ec6976-config-volume\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.653714 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-signing-cabundle\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.654895 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/24b61f27-cb4d-4611-a0e8-14f618385f83-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.655942 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.155928576 +0000 UTC m=+143.127320563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.656564 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f4ab691e-904d-49b1-9b3e-57a8271bd791-images\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.658737 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65c59658-5ed8-4cef-b36d-2a1e44ec6976-secret-volume\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:26 crc kubenswrapper[4992]: 
I0131 09:27:26.659298 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4ab691e-904d-49b1-9b3e-57a8271bd791-proxy-tls\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.660140 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-signing-key\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.660711 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.661795 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9fb8fd57-5826-40cd-b62d-2a53e9e0c72c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h8jxh\" (UID: \"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.663157 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/90220a85-3f5b-4360-9cc9-d9c9a65db928-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-b99x8\" (UID: \"90220a85-3f5b-4360-9cc9-d9c9a65db928\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.664561 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a924bba-57c2-4c3b-9560-ab10bed041cf-config\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.665743 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f8fe86e-1b21-4358-aa22-4c0939d313f7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.665825 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1993d67-8009-4a0c-90b2-517e1504bc2a-metrics-tls\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.665874 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82abaea9-9d21-432e-a434-d21fe6a7197b-srv-cert\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.665998 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3a924bba-57c2-4c3b-9560-ab10bed041cf-serving-cert\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.666370 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9398e230-5d76-4418-9807-be17513913c0-srv-cert\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.666709 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9398e230-5d76-4418-9807-be17513913c0-profile-collector-cert\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.669205 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-node-bootstrap-token\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.669281 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7edcde3-ebb8-4a50-a75f-28539482b78b-cert\") pod \"ingress-canary-np7kv\" (UID: \"e7edcde3-ebb8-4a50-a75f-28539482b78b\") " pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.669731 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82abaea9-9d21-432e-a434-d21fe6a7197b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.669749 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46b9e525-ef69-4600-92c7-8eb418824669-proxy-tls\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.671521 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24b61f27-cb4d-4611-a0e8-14f618385f83-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.672549 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltqsj\" (UniqueName: \"kubernetes.io/projected/eb9a2d0a-5c18-44d4-aa62-922d1937a7a4-kube-api-access-ltqsj\") pod \"cluster-samples-operator-665b6dd947-kmv44\" (UID: \"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.673527 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbfb9fcb-64ca-4640-a230-4212004d2494-apiservice-cert\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.673991 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a364be49-d097-438f-858d-77e2bcff5ad0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7kkdk\" (UID: \"a364be49-d097-438f-858d-77e2bcff5ad0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.676975 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-certs\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.693448 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7269408-3cf0-468a-a5d4-2625ff71b408-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.708580 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fxvn\" (UniqueName: \"kubernetes.io/projected/58e3fff7-b9d0-4107-96ae-a0a00965f574-kube-api-access-5fxvn\") pod \"openshift-apiserver-operator-796bbdcf4f-bzb6n\" (UID: \"58e3fff7-b9d0-4107-96ae-a0a00965f574\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.712922 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a88cb09-555e-4fd5-9ffc-fcff02f2bf35-metrics-tls\") pod 
\"dns-operator-744455d44c-tq5sk\" (UID: \"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35\") " pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.716847 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbfb9fcb-64ca-4640-a230-4212004d2494-webhook-cert\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.716966 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-serving-cert\") pod \"controller-manager-879f6c89f-srdfh\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.717634 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsnbx\" (UniqueName: \"kubernetes.io/projected/db3860c3-37de-4fa5-9c79-965abd0e2149-kube-api-access-rsnbx\") pod \"oauth-openshift-558db77b4-vktdq\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.729605 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-bound-sa-token\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.749521 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzrf5\" (UniqueName: 
\"kubernetes.io/projected/1aa0742f-dd1e-46bf-ba75-5368a621cb89-kube-api-access-nzrf5\") pod \"etcd-operator-b45778765-2m8dn\" (UID: \"1aa0742f-dd1e-46bf-ba75-5368a621cb89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.750980 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.751676 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.251641391 +0000 UTC m=+143.223033378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.759937 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.788952 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8npb\" (UniqueName: \"kubernetes.io/projected/b7269408-3cf0-468a-a5d4-2625ff71b408-kube-api-access-n8npb\") pod \"ingress-operator-5b745b69d9-sxddh\" (UID: \"b7269408-3cf0-468a-a5d4-2625ff71b408\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.814186 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m47f4\" (UniqueName: \"kubernetes.io/projected/dd243542-ca16-4b95-9fa1-b579ee3cca2e-kube-api-access-m47f4\") pod \"downloads-7954f5f757-jgrjj\" (UID: \"dd243542-ca16-4b95-9fa1-b579ee3cca2e\") " pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.814509 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.825553 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.833542 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w459n\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-kube-api-access-w459n\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.852795 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.853092 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvk94\" (UniqueName: \"kubernetes.io/projected/4e325b9b-2d5b-4fca-8344-e190bed4cdd2-kube-api-access-tvk94\") pod \"authentication-operator-69f744f599-wrmqz\" (UID: \"4e325b9b-2d5b-4fca-8344-e190bed4cdd2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.853141 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.353128306 +0000 UTC m=+143.324520293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.860622 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.864561 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk"] Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.875061 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd4qw\" (UniqueName: \"kubernetes.io/projected/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-kube-api-access-cd4qw\") pod \"console-f9d7485db-7bjlw\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.890353 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52dff\" (UniqueName: \"kubernetes.io/projected/b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6-kube-api-access-52dff\") pod \"router-default-5444994796-8vlmm\" (UID: \"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6\") " pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.892780 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.894568 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.909070 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt"] Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.909120 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-68zwk"] Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.913241 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.915069 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxjmn\" (UniqueName: \"kubernetes.io/projected/7425a945-4499-4a87-b745-d31e5dbf9d0e-kube-api-access-rxjmn\") pod \"machine-api-operator-5694c8668f-glkns\" (UID: \"7425a945-4499-4a87-b745-d31e5dbf9d0e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.928807 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp"] Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.931079 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv"] Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.938306 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvtz\" (UniqueName: \"kubernetes.io/projected/3bfb568a-6ad7-41fd-86da-7aaf96ecd991-kube-api-access-8hvtz\") pod \"console-operator-58897d9998-6nssv\" (UID: \"3bfb568a-6ad7-41fd-86da-7aaf96ecd991\") " pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 
09:27:26.938452 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.945819 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.955686 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.955924 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.956283 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.456241608 +0000 UTC m=+143.427633595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.956638 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:26 crc kubenswrapper[4992]: E0131 09:27:26.960358 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.460297887 +0000 UTC m=+143.431689884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.973250 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.979899 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j45l\" (UniqueName: \"kubernetes.io/projected/9398e230-5d76-4418-9807-be17513913c0-kube-api-access-4j45l\") pod \"catalog-operator-68c6474976-45qvp\" (UID: \"9398e230-5d76-4418-9807-be17513913c0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:26 crc kubenswrapper[4992]: I0131 09:27:26.990242 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bps8z\" (UniqueName: \"kubernetes.io/projected/047a41f9-3608-40a0-a1a2-ccdde5061412-kube-api-access-bps8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-v7r7n\" (UID: \"047a41f9-3608-40a0-a1a2-ccdde5061412\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:26.997193 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwx8\" (UniqueName: \"kubernetes.io/projected/e7edcde3-ebb8-4a50-a75f-28539482b78b-kube-api-access-qdwx8\") pod \"ingress-canary-np7kv\" (UID: \"e7edcde3-ebb8-4a50-a75f-28539482b78b\") " pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.009713 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" event={"ID":"3c397c46-5579-414a-aca9-3822b9e603ea","Type":"ContainerStarted","Data":"6eabb2fef30805cd1ec94df1aba7ad87d4a2d5adbfc5f7b41bc26017116454e9"} Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.010390 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" 
event={"ID":"545db117-1eda-49e5-96e5-223285792b1c","Type":"ContainerStarted","Data":"f80a2edc9d2f34f0bc5037737b6a69c5eaa506187734d304c3aaed7e9b0731d0"} Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.016214 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" event={"ID":"e86d38f0-15ae-4043-a550-54cae8cf4e8d","Type":"ContainerStarted","Data":"a13d881c180471562b02a9951969853ca7e83412e8e40eb2df57ffa381a9d83a"} Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.017135 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" event={"ID":"fd0c1366-dfbf-487e-98a4-94fb4be75045","Type":"ContainerStarted","Data":"b77a94176075eb7077a3fc6157d08842633c7867dec7e84341500a823ed2d627"} Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.022751 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" event={"ID":"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66","Type":"ContainerStarted","Data":"761aca2642fe7b08e8fe1ccdeca3876b2ecfae585d7e9512cc4bbf3eab9583e0"} Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.023726 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" event={"ID":"6e0e4407-bfda-4d16-9e9e-d9065286a07d","Type":"ContainerStarted","Data":"45c94752eb46e8f11f31b7d441b76ddc27ddcacec09dae26646099e51fe00252"} Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.049915 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5z97\" (UniqueName: \"kubernetes.io/projected/24b61f27-cb4d-4611-a0e8-14f618385f83-kube-api-access-l5z97\") pod \"kube-storage-version-migrator-operator-b67b599dd-ddxxb\" (UID: \"24b61f27-cb4d-4611-a0e8-14f618385f83\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.052649 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.059229 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.059366 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.55934557 +0000 UTC m=+143.530737567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.059627 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.060005 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.559994059 +0000 UTC m=+143.531386046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.069313 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbdk\" (UniqueName: \"kubernetes.io/projected/9fb8fd57-5826-40cd-b62d-2a53e9e0c72c-kube-api-access-jwbdk\") pod \"control-plane-machine-set-operator-78cbb6b69f-h8jxh\" (UID: \"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.089577 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.091070 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br8xx\" (UniqueName: \"kubernetes.io/projected/3f216997-1de4-499d-b5f2-0bacbbdbdd36-kube-api-access-br8xx\") pod \"migrator-59844c95c7-5564v\" (UID: \"3f216997-1de4-499d-b5f2-0bacbbdbdd36\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.110944 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qql4s\" (UniqueName: \"kubernetes.io/projected/46b9e525-ef69-4600-92c7-8eb418824669-kube-api-access-qql4s\") pod \"machine-config-controller-84d6567774-dntz7\" (UID: \"46b9e525-ef69-4600-92c7-8eb418824669\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:27 crc 
kubenswrapper[4992]: I0131 09:27:27.111764 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.129360 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkjg\" (UniqueName: \"kubernetes.io/projected/a364be49-d097-438f-858d-77e2bcff5ad0-kube-api-access-thkjg\") pod \"multus-admission-controller-857f4d67dd-7kkdk\" (UID: \"a364be49-d097-438f-858d-77e2bcff5ad0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.133442 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.146262 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.149083 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhw87\" (UniqueName: \"kubernetes.io/projected/5eb9b7a9-c9b5-4b31-9773-dee9b786ba91-kube-api-access-nhw87\") pod \"machine-config-server-9hqbq\" (UID: \"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91\") " pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.160015 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.160159 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.660133654 +0000 UTC m=+143.631525641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.160297 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.160691 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.660683641 +0000 UTC m=+143.632075628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.164896 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpc5j\" (UniqueName: \"kubernetes.io/projected/c1993d67-8009-4a0c-90b2-517e1504bc2a-kube-api-access-hpc5j\") pod \"dns-default-2bwrf\" (UID: \"c1993d67-8009-4a0c-90b2-517e1504bc2a\") " pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.167993 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-np7kv" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.170635 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrrlf\" (UniqueName: \"kubernetes.io/projected/24d2a857-eb20-4eb7-acb2-077e53af8b03-kube-api-access-qrrlf\") pod \"csi-hostpathplugin-cb5lw\" (UID: \"24d2a857-eb20-4eb7-acb2-077e53af8b03\") " pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.171709 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpf6w\" (UniqueName: \"kubernetes.io/projected/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-kube-api-access-qpf6w\") pod \"marketplace-operator-79b997595-r68rm\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.186962 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hrbst\" (UniqueName: \"kubernetes.io/projected/f4ab691e-904d-49b1-9b3e-57a8271bd791-kube-api-access-hrbst\") pod \"machine-config-operator-74547568cd-8r6k8\" (UID: \"f4ab691e-904d-49b1-9b3e-57a8271bd791\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:27 crc kubenswrapper[4992]: W0131 09:27:27.190605 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39f4282_81b2_41a4_8283_5851f4005972.slice/crio-c2dbcb7ee8bbb8b4a83f5c28c36a6912ca446531106683fd02b0b6344427a8dc WatchSource:0}: Error finding container c2dbcb7ee8bbb8b4a83f5c28c36a6912ca446531106683fd02b0b6344427a8dc: Status 404 returned error can't find the container with id c2dbcb7ee8bbb8b4a83f5c28c36a6912ca446531106683fd02b0b6344427a8dc Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.201647 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.214936 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qh79\" (UniqueName: \"kubernetes.io/projected/9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d-kube-api-access-2qh79\") pod \"service-ca-9c57cc56f-v7m78\" (UID: \"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d\") " pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.234932 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhnsb\" (UniqueName: \"kubernetes.io/projected/65c59658-5ed8-4cef-b36d-2a1e44ec6976-kube-api-access-rhnsb\") pod \"collect-profiles-29497515-tqk6n\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.242993 4992 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrmqz"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.265206 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-srdfh"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.265321 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.265530 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.765512473 +0000 UTC m=+143.736904460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.265604 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.266125 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.766112311 +0000 UTC m=+143.737504298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.273643 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6mtl\" (UniqueName: \"kubernetes.io/projected/fbfb9fcb-64ca-4640-a230-4212004d2494-kube-api-access-n6mtl\") pod \"packageserver-d55dfcdfc-kc7bw\" (UID: \"fbfb9fcb-64ca-4640-a230-4212004d2494\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.275128 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.279198 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.289643 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.294458 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f8fe86e-1b21-4358-aa22-4c0939d313f7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-45rxt\" (UID: \"4f8fe86e-1b21-4358-aa22-4c0939d313f7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.298242 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2f6\" (UniqueName: \"kubernetes.io/projected/82abaea9-9d21-432e-a434-d21fe6a7197b-kube-api-access-8g2f6\") pod \"olm-operator-6b444d44fb-25wn6\" (UID: \"82abaea9-9d21-432e-a434-d21fe6a7197b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.299804 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.308173 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnx7p\" (UniqueName: \"kubernetes.io/projected/90220a85-3f5b-4360-9cc9-d9c9a65db928-kube-api-access-tnx7p\") pod \"package-server-manager-789f6589d5-b99x8\" (UID: \"90220a85-3f5b-4360-9cc9-d9c9a65db928\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.319567 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.327198 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.352138 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.352540 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tksxk\" (UniqueName: \"kubernetes.io/projected/3a924bba-57c2-4c3b-9560-ab10bed041cf-kube-api-access-tksxk\") pod \"service-ca-operator-777779d784-jr8pm\" (UID: \"3a924bba-57c2-4c3b-9560-ab10bed041cf\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.352832 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.356144 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2m8dn"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.362131 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkldh\" (UniqueName: \"kubernetes.io/projected/0a88cb09-555e-4fd5-9ffc-fcff02f2bf35-kube-api-access-nkldh\") pod \"dns-operator-744455d44c-tq5sk\" (UID: \"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35\") " pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.363460 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.365045 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jgrjj"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.367580 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.367716 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.867694288 +0000 UTC m=+143.839086275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.367865 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.368260 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.368367 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.868264315 +0000 UTC m=+143.839656302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.368632 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.378174 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.391471 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vktdq"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.392952 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.399990 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:27 crc kubenswrapper[4992]: W0131 09:27:27.403978 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e325b9b_2d5b_4fca_8344_e190bed4cdd2.slice/crio-23b8825e13b78225b1a0ec6411753acc38f3d7b6527be1f5c1858311e3aff5d8 WatchSource:0}: Error finding container 23b8825e13b78225b1a0ec6411753acc38f3d7b6527be1f5c1858311e3aff5d8: Status 404 returned error can't find the container with id 23b8825e13b78225b1a0ec6411753acc38f3d7b6527be1f5c1858311e3aff5d8 Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.410595 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.424225 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.432410 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.443439 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9hqbq" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.447630 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.447693 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glkns"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.461407 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.468884 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.469096 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.96907467 +0000 UTC m=+143.940466657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.469143 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.469515 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-31 09:27:27.969506012 +0000 UTC m=+143.940897989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: W0131 09:27:27.538682 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7269408_3cf0_468a_a5d4_2625ff71b408.slice/crio-e44dbfce65490bf0ffe6c16de370280459d37e481243e29cc34fb8cb9282bbf0 WatchSource:0}: Error finding container e44dbfce65490bf0ffe6c16de370280459d37e481243e29cc34fb8cb9282bbf0: Status 404 returned error can't find the container with id e44dbfce65490bf0ffe6c16de370280459d37e481243e29cc34fb8cb9282bbf0 Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.571720 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.571965 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.071938305 +0000 UTC m=+144.043330292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.572447 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.572854 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.072839071 +0000 UTC m=+144.044231058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.610095 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.635785 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6nssv"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.673074 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.674255 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.174213663 +0000 UTC m=+144.145605640 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.693729 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7bjlw"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.722186 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.744525 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.775210 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.775506 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.275492191 +0000 UTC m=+144.246884178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.809359 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-np7kv"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.877435 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.877707 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.377681917 +0000 UTC m=+144.349073904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.973088 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8"] Jan 31 09:27:27 crc kubenswrapper[4992]: I0131 09:27:27.981969 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:27 crc kubenswrapper[4992]: E0131 09:27:27.982335 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.482323794 +0000 UTC m=+144.453715781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.046651 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" event={"ID":"fd0c1366-dfbf-487e-98a4-94fb4be75045","Type":"ContainerStarted","Data":"882f2f6ca98c739222008605ab5cb4b9fbce788acd60730fad06babc4f6cf433"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.055953 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" event={"ID":"4e325b9b-2d5b-4fca-8344-e190bed4cdd2","Type":"ContainerStarted","Data":"21ed2d16c985a24c6bfbde959f6ed2d29613386a9a9eeb2021207ebbead0dff2"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.055987 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" event={"ID":"4e325b9b-2d5b-4fca-8344-e190bed4cdd2","Type":"ContainerStarted","Data":"23b8825e13b78225b1a0ec6411753acc38f3d7b6527be1f5c1858311e3aff5d8"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.090817 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.091638 4992 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.591616437 +0000 UTC m=+144.563008424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.107106 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.123325 4992 generic.go:334] "Generic (PLEG): container finished" podID="6e0e4407-bfda-4d16-9e9e-d9065286a07d" containerID="2ab55b23c7e49fe27d7c83634d933cf13231ca16a92166d67973560fd32783f1" exitCode=0 Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.123471 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" event={"ID":"6e0e4407-bfda-4d16-9e9e-d9065286a07d","Type":"ContainerDied","Data":"2ab55b23c7e49fe27d7c83634d933cf13231ca16a92166d67973560fd32783f1"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.125537 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6nssv" event={"ID":"3bfb568a-6ad7-41fd-86da-7aaf96ecd991","Type":"ContainerStarted","Data":"6bfdac9bf8135541669d0410c194dfbac2038fb05933dcd8d8f7dda3d8007cc0"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.144636 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" event={"ID":"c39f4282-81b2-41a4-8283-5851f4005972","Type":"ContainerStarted","Data":"409ce6421da1b9083d01d3664e7a1d6d2df06058d57214ec3bc5de9fc9b590ad"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.144685 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" event={"ID":"c39f4282-81b2-41a4-8283-5851f4005972","Type":"ContainerStarted","Data":"c2dbcb7ee8bbb8b4a83f5c28c36a6912ca446531106683fd02b0b6344427a8dc"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.149442 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.150055 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" event={"ID":"58e3fff7-b9d0-4107-96ae-a0a00965f574","Type":"ContainerStarted","Data":"543c3126f96ccadd586a1aa1e062c5094897103b3a66eba81c1c6a430c3071d0"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.152350 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-np7kv" event={"ID":"e7edcde3-ebb8-4a50-a75f-28539482b78b","Type":"ContainerStarted","Data":"82f130800af9c51fdd16057096d266937b6ed4ae5ad24198fa2cca0e8e7c5f0c"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.158375 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" event={"ID":"047a41f9-3608-40a0-a1a2-ccdde5061412","Type":"ContainerStarted","Data":"efac33f1805243a96ddf7cb7e217071c1b4f4e08b4f08201e9a199102857243f"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.168244 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" event={"ID":"7425a945-4499-4a87-b745-d31e5dbf9d0e","Type":"ContainerStarted","Data":"5f3147fa571079f9163ad2bdaec955e80bd5198d26cfb30457368296f6837248"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.179653 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" event={"ID":"e86d38f0-15ae-4043-a550-54cae8cf4e8d","Type":"ContainerStarted","Data":"d11b3208c64c80ec088e1aca0cfddeabc6f8b36f1d03ab1e2811f783720b5ae8"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.183497 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" event={"ID":"db3860c3-37de-4fa5-9c79-965abd0e2149","Type":"ContainerStarted","Data":"beccbd9b3c22c35e049051905fc1b8b6b84891beeb328fe8e26316935316466f"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.185121 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8vlmm" event={"ID":"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6","Type":"ContainerStarted","Data":"7ba722d3d95209fe8d3771fa6eb94139b7043326b42270cca5c7d32909e3db8d"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.189059 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-v7m78"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.191037 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" event={"ID":"9398e230-5d76-4418-9807-be17513913c0","Type":"ContainerStarted","Data":"c1798cb43e72939f79e4dc6c030a773f6015b46510818bb5dcdad272393d95af"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.192606 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.193463 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.693451852 +0000 UTC m=+144.664843839 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.194210 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" event={"ID":"b7269408-3cf0-468a-a5d4-2625ff71b408","Type":"ContainerStarted","Data":"e44dbfce65490bf0ffe6c16de370280459d37e481243e29cc34fb8cb9282bbf0"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.197630 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7kkdk"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.198912 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" event={"ID":"7d272437-fe00-4ec0-ba5b-35e25d59ccf2","Type":"ContainerStarted","Data":"2a0ec446f718fbb5112978d8520d4a631c34ad648562e0dd6bfb6e3e74bf79e9"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.200819 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" event={"ID":"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220","Type":"ContainerStarted","Data":"932699a81f28fce7116e7a63e79aec77f5e92242b008bf75611e4401943a78dd"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.203328 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jgrjj" event={"ID":"dd243542-ca16-4b95-9fa1-b579ee3cca2e","Type":"ContainerStarted","Data":"a8c2493850e8f6ed091b3a328ca7d9563ed905fa120291536fbed89a3b427380"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.206188 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" event={"ID":"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4","Type":"ContainerStarted","Data":"d04a188a1d19888de56046b8dfc5ae19ab24430a97dc0ab7bd5553188dacc00b"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.210907 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7bjlw" event={"ID":"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6","Type":"ContainerStarted","Data":"39c2143b93c2c591d93b4302609f1c2408c48d383e898a9a16f4b1c67731da1b"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.213106 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" event={"ID":"1aa0742f-dd1e-46bf-ba75-5368a621cb89","Type":"ContainerStarted","Data":"1d295398e392b45b5639a14a96a26713e11ee42cfdae91456977687da3eb4ede"} Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.295867 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 
09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.296011 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.795992308 +0000 UTC m=+144.767384295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.296351 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.297469 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.79744812 +0000 UTC m=+144.768840107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.399486 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.399595 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.899571323 +0000 UTC m=+144.870963310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.400469 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.400917 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:28.900901952 +0000 UTC m=+144.872293939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.481348 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.503039 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.503674 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.003642553 +0000 UTC m=+144.975034540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.605194 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.605653 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.105637463 +0000 UTC m=+145.077029450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.674053 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2bwrf"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.678142 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.682374 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.700014 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.703169 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.709563 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.710003 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.209987001 +0000 UTC m=+145.181378988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: W0131 09:27:28.809992 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1993d67_8009_4a0c_90b2_517e1504bc2a.slice/crio-8e68878e156bedae996d4843f2e64d1ca8b528aead194ed4e1675984814c2829 WatchSource:0}: Error finding container 8e68878e156bedae996d4843f2e64d1ca8b528aead194ed4e1675984814c2829: Status 404 returned error can't find the container with id 8e68878e156bedae996d4843f2e64d1ca8b528aead194ed4e1675984814c2829 Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.810614 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:28 crc kubenswrapper[4992]: W0131 09:27:28.810930 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b9e525_ef69_4600_92c7_8eb418824669.slice/crio-cc49a69fffc57f3513fc04d57a72eeb5be1e31f6761829dfb96a45cdde7f168a WatchSource:0}: Error finding container cc49a69fffc57f3513fc04d57a72eeb5be1e31f6761829dfb96a45cdde7f168a: Status 404 returned error can't find the container with id cc49a69fffc57f3513fc04d57a72eeb5be1e31f6761829dfb96a45cdde7f168a Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.811065 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.311047984 +0000 UTC m=+145.282439971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:28 crc kubenswrapper[4992]: W0131 09:27:28.814074 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb8fd57_5826_40cd_b62d_2a53e9e0c72c.slice/crio-825e5fd45c5fc5ae443741956f4bbb32e5c96da82d598001fcd5aa4b243cd352 WatchSource:0}: Error finding container 825e5fd45c5fc5ae443741956f4bbb32e5c96da82d598001fcd5aa4b243cd352: Status 404 returned error can't find the container with id 825e5fd45c5fc5ae443741956f4bbb32e5c96da82d598001fcd5aa4b243cd352 Jan 31 09:27:28 crc kubenswrapper[4992]: W0131 09:27:28.827184 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b61f27_cb4d_4611_a0e8_14f618385f83.slice/crio-8b37f4d6cc0983d5fc49a94cfa0b66255b02a056cf4368d2efb733990eaaa416 WatchSource:0}: Error finding container 8b37f4d6cc0983d5fc49a94cfa0b66255b02a056cf4368d2efb733990eaaa416: Status 404 returned error can't find the container with id 8b37f4d6cc0983d5fc49a94cfa0b66255b02a056cf4368d2efb733990eaaa416 Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.856705 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.859679 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n"] Jan 31 09:27:28 crc kubenswrapper[4992]: I0131 09:27:28.912066 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:28 crc kubenswrapper[4992]: E0131 09:27:28.912256 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.412234599 +0000 UTC m=+145.383626576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.015552 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.017692 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.516795594 +0000 UTC m=+145.488187581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.022991 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r68rm"] Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.024942 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6"] Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.029544 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zcs9d" podStartSLOduration=122.029513937 podStartE2EDuration="2m2.029513937s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:29.022040278 +0000 UTC m=+144.993432285" watchObservedRunningTime="2026-01-31 09:27:29.029513937 +0000 UTC m=+145.000905924" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.032811 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tq5sk"] Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.034458 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cb5lw"] Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.117029 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.117345 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.617245058 +0000 UTC m=+145.588637045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.117706 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.118086 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.618071623 +0000 UTC m=+145.589463610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.219711 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.220132 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.720115254 +0000 UTC m=+145.691507241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.222790 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8vlmm" event={"ID":"b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6","Type":"ContainerStarted","Data":"f632767f6754aa97c02b870355225030ab520e5abdf230506e9461b048e26b13"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.235448 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" event={"ID":"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d","Type":"ContainerStarted","Data":"a518b8a253d973663cdd326107eae03762e51c50e4d176fc1010e1f1fa26ce49"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.260829 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" event={"ID":"65c59658-5ed8-4cef-b36d-2a1e44ec6976","Type":"ContainerStarted","Data":"7506ff95ec461360716fe2190e78dae2158766489ff856cbeb956d2e24d55980"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.293633 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" event={"ID":"42351b07-cf74-49fd-b6fd-88b7ef8fdac0","Type":"ContainerStarted","Data":"7ab41c0f05464b651a1c86b5589e7401ff8a843a151d37e96a601a5b6a14a2e4"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.313606 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" 
event={"ID":"3c397c46-5579-414a-aca9-3822b9e603ea","Type":"ContainerStarted","Data":"60007e6d3b815848c7fb9e8e80a70c610d4e6f3dc77a4acbe2c99127fbfe39aa"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.313838 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.314998 4992 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bcggv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.315048 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.315738 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" event={"ID":"4f8fe86e-1b21-4358-aa22-4c0939d313f7","Type":"ContainerStarted","Data":"8ffee503cdb7c9068d0a97f0696f65141327dabd98d2a1650d1e62a114372c58"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.317283 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" event={"ID":"3a924bba-57c2-4c3b-9560-ab10bed041cf","Type":"ContainerStarted","Data":"e71a05d25cb54a0e8e5d6829025dccbe22b839a2656974b64c6c00a2cda5d925"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.320817 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" event={"ID":"24b61f27-cb4d-4611-a0e8-14f618385f83","Type":"ContainerStarted","Data":"8b37f4d6cc0983d5fc49a94cfa0b66255b02a056cf4368d2efb733990eaaa416"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.321392 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.321918 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.821903117 +0000 UTC m=+145.793295104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.323102 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" event={"ID":"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c","Type":"ContainerStarted","Data":"825e5fd45c5fc5ae443741956f4bbb32e5c96da82d598001fcd5aa4b243cd352"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.324948 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" event={"ID":"a364be49-d097-438f-858d-77e2bcff5ad0","Type":"ContainerStarted","Data":"1efdb3e86be029f2f3e8396b4f9808649fcb93ab629d851dbc1f8d6684eb4760"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.336335 4992 generic.go:334] "Generic (PLEG): container finished" podID="fd0c1366-dfbf-487e-98a4-94fb4be75045" containerID="882f2f6ca98c739222008605ab5cb4b9fbce788acd60730fad06babc4f6cf433" exitCode=0 Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.336491 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" event={"ID":"fd0c1366-dfbf-487e-98a4-94fb4be75045","Type":"ContainerDied","Data":"882f2f6ca98c739222008605ab5cb4b9fbce788acd60730fad06babc4f6cf433"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.338291 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2bwrf" 
event={"ID":"c1993d67-8009-4a0c-90b2-517e1504bc2a","Type":"ContainerStarted","Data":"8e68878e156bedae996d4843f2e64d1ca8b528aead194ed4e1675984814c2829"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.343588 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" event={"ID":"46b9e525-ef69-4600-92c7-8eb418824669","Type":"ContainerStarted","Data":"cc49a69fffc57f3513fc04d57a72eeb5be1e31f6761829dfb96a45cdde7f168a"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.353058 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" event={"ID":"f4ab691e-904d-49b1-9b3e-57a8271bd791","Type":"ContainerStarted","Data":"e7f30308e6f693f804b9f8f61eecf1932122f05fc4b81297ab9ae921b5f5efd7"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.354441 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" event={"ID":"90220a85-3f5b-4360-9cc9-d9c9a65db928","Type":"ContainerStarted","Data":"b8ccd9b0553e1eb065f3ac9b3ef167e6c3a0d633681be3b4bf73fab29c323c88"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.355113 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" event={"ID":"3f216997-1de4-499d-b5f2-0bacbbdbdd36","Type":"ContainerStarted","Data":"a04e0db097432a593d755b221a5c3ca05b0d6327c67a4bbdb1ffe821956c0287"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.366248 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" event={"ID":"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220","Type":"ContainerStarted","Data":"5ff37b9426106bb1df8796a01750e6bac347fb5a39840af90789468ee82dd76a"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.375072 4992 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-console/downloads-7954f5f757-jgrjj" event={"ID":"dd243542-ca16-4b95-9fa1-b579ee3cca2e","Type":"ContainerStarted","Data":"25eeff866992b77a0acc0b99d7fc52c1e7f84a70bfddcdc55186a0e4c6ceab85"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.376366 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" event={"ID":"fbfb9fcb-64ca-4640-a230-4212004d2494","Type":"ContainerStarted","Data":"6f29e77b1037bf820eec7eaed932cba358b5b17c6c068a29bdd3ec81c8554058"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.377958 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" event={"ID":"545db117-1eda-49e5-96e5-223285792b1c","Type":"ContainerStarted","Data":"572650163b5e73e5a25d3e9730e389744f7e79ace67c72c4bcb85fa5ec84d602"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.386768 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" event={"ID":"24d2a857-eb20-4eb7-acb2-077e53af8b03","Type":"ContainerStarted","Data":"85167e40beeab688fe91317e6d21b5c1e39390735f343d871ae1ff97fefdb2c7"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.387947 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" event={"ID":"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35","Type":"ContainerStarted","Data":"4fb424f593434bb3679957fe81b68645b0de4f3a8b1afe5f0fe57545ce1a778e"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.388760 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" event={"ID":"82abaea9-9d21-432e-a434-d21fe6a7197b","Type":"ContainerStarted","Data":"712a0aebc1bc3f3c03df0dfebda2a3a999b26e7429aef293abebf94eca2f6e5d"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.389506 4992 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9hqbq" event={"ID":"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91","Type":"ContainerStarted","Data":"0cb3cd659bc07e0236ea404e4e035b576fb82859107ee2c9bfb6c9e5d2547b30"} Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.422912 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.423574 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:29.923550117 +0000 UTC m=+145.894942114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.459698 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8vlmm" podStartSLOduration=122.459669245 podStartE2EDuration="2m2.459669245s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:29.454277397 +0000 UTC m=+145.425669414" watchObservedRunningTime="2026-01-31 09:27:29.459669245 +0000 UTC m=+145.431061232" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.490660 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podStartSLOduration=122.490642903 podStartE2EDuration="2m2.490642903s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:29.48884058 +0000 UTC m=+145.460232587" watchObservedRunningTime="2026-01-31 09:27:29.490642903 +0000 UTC m=+145.462034890" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.534968 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: 
\"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.536414 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.036400524 +0000 UTC m=+146.007792511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.578604 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrmqz" podStartSLOduration=122.578580501 podStartE2EDuration="2m2.578580501s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:29.533372726 +0000 UTC m=+145.504764733" watchObservedRunningTime="2026-01-31 09:27:29.578580501 +0000 UTC m=+145.549972488" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.636971 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.637383 4992 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.137366564 +0000 UTC m=+146.108758551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.738159 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.738663 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.238642512 +0000 UTC m=+146.210034499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.840826 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.841069 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.341041894 +0000 UTC m=+146.312433881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.841688 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.842633 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.3426146 +0000 UTC m=+146.314006587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.941952 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.942234 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:29 crc kubenswrapper[4992]: E0131 09:27:29.942649 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.442635492 +0000 UTC m=+146.414027479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.942969 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 31 09:27:29 crc kubenswrapper[4992]: I0131 09:27:29.943005 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.043190 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.043880 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.543861389 +0000 UTC m=+146.515253376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.144684 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.144967 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.644951922 +0000 UTC m=+146.616343909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.246975 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.247525 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.747500038 +0000 UTC m=+146.718892025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.348660 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.349091 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.849072285 +0000 UTC m=+146.820464272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.395933 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" event={"ID":"6e0e4407-bfda-4d16-9e9e-d9065286a07d","Type":"ContainerStarted","Data":"830e93e59e4c55db3f12c1c84f7a427cd4fe6002b252f6b0b6fd8e68c318014f"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.397089 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6nssv" event={"ID":"3bfb568a-6ad7-41fd-86da-7aaf96ecd991","Type":"ContainerStarted","Data":"cdd8f0965f8e2e9609dbf8266bf67a7742bde0af14f7fef27155b3649fae7c5d"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.398583 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.399935 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" event={"ID":"047a41f9-3608-40a0-a1a2-ccdde5061412","Type":"ContainerStarted","Data":"8a7d55f103713bdd0070bbd7c5ebc0f787d3d02638ae91f1fd378d69455157ac"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.400574 4992 patch_prober.go:28] interesting pod/console-operator-58897d9998-6nssv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: 
connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.400626 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6nssv" podUID="3bfb568a-6ad7-41fd-86da-7aaf96ecd991" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.401609 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-np7kv" event={"ID":"e7edcde3-ebb8-4a50-a75f-28539482b78b","Type":"ContainerStarted","Data":"6f660d57286032cc67fbe6773522904f09ea07fb7b1c9cb5d66cf63629e85ebf"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.402747 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" event={"ID":"1aa0742f-dd1e-46bf-ba75-5368a621cb89","Type":"ContainerStarted","Data":"b47d7d5c08450dc3caa815347e6a0f882b14074139e8906e5bceb71d7d532bca"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.406725 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" event={"ID":"a364be49-d097-438f-858d-77e2bcff5ad0","Type":"ContainerStarted","Data":"be2420bcf8f06d2626cf34024cf62a74b131ff556ee8e7c7ece415af72aeef63"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.407795 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" event={"ID":"82abaea9-9d21-432e-a434-d21fe6a7197b","Type":"ContainerStarted","Data":"231f247f7a2dd884b4e58d4c6cc221fb9dfe0e7e18a8c74a10c00c31e8e329a2"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.408888 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" 
event={"ID":"46b9e525-ef69-4600-92c7-8eb418824669","Type":"ContainerStarted","Data":"7137845fb214c24994adb7a1b7ecb5453211816c7776ff50da29ea5a46266241"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.409762 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9hqbq" event={"ID":"5eb9b7a9-c9b5-4b31-9773-dee9b786ba91","Type":"ContainerStarted","Data":"d60faf06679bf84f5c0b66dc4a2ccb8f61a65a03cf4467b28e6ec75987e17efa"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.411507 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" event={"ID":"9e8f83f8-4c88-4b0a-acd2-bb7ee606c63d","Type":"ContainerStarted","Data":"66cbca9f725caa65374620959ddcd749ec604eeceaf1b9403bfd75f4dffd4028"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.413231 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" event={"ID":"7425a945-4499-4a87-b745-d31e5dbf9d0e","Type":"ContainerStarted","Data":"5df7c2e5c0952bb108a928648720e38c0ff568dd805ad7b132fcf0683593b2e7"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.414211 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" event={"ID":"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4","Type":"ContainerStarted","Data":"9adf6776d9a34392d8efba06cb1d2af15a672210d868a1c75249ec0b6460ba0b"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.415261 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" event={"ID":"f4ab691e-904d-49b1-9b3e-57a8271bd791","Type":"ContainerStarted","Data":"6ff70cb530b7d23359e7563e7cfe9f523b986fd51982d4e6f8eca904c85001b1"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.416319 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" event={"ID":"21c6aa89-3d4a-4afe-b9bd-d55e5ff9ef66","Type":"ContainerStarted","Data":"0fd09f2a33eef139ac656d964fb5adfbdb36ab03fe7bb4189d66e6785f5bf24a"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.418626 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f7qlp" podStartSLOduration=123.418614013 podStartE2EDuration="2m3.418614013s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:29.580527988 +0000 UTC m=+145.551919995" watchObservedRunningTime="2026-01-31 09:27:30.418614013 +0000 UTC m=+146.390006000" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.418686 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" event={"ID":"90220a85-3f5b-4360-9cc9-d9c9a65db928","Type":"ContainerStarted","Data":"bd94fb1d065e3b85c0bd45ae4256970313e012bcde840921974ee758941958fd"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.419198 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6nssv" podStartSLOduration=123.41919361 podStartE2EDuration="2m3.41919361s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.417877962 +0000 UTC m=+146.389270019" watchObservedRunningTime="2026-01-31 09:27:30.41919361 +0000 UTC m=+146.390585607" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.419992 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" 
event={"ID":"3a924bba-57c2-4c3b-9560-ab10bed041cf","Type":"ContainerStarted","Data":"42c0758c519723624d5a1f070852915aa9ef5362043a02a22e78e8c270479d22"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.433712 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" event={"ID":"9fb8fd57-5826-40cd-b62d-2a53e9e0c72c","Type":"ContainerStarted","Data":"2a252fc2b242c1b5cc228bbe4701f6106e51838a849982ae34c2ef371473dace"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.435200 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" event={"ID":"24b61f27-cb4d-4611-a0e8-14f618385f83","Type":"ContainerStarted","Data":"c0492bc53eedb3bb45eb7e4156e90283babd886a8000d085cc6680513c3273e7"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.439465 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" event={"ID":"e86d38f0-15ae-4043-a550-54cae8cf4e8d","Type":"ContainerStarted","Data":"bd391c6c4bb6650bab3dd550223343dcfd2e19b2ca126ac99d5bbb1daf8eaf32"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.441883 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" event={"ID":"9398e230-5d76-4418-9807-be17513913c0","Type":"ContainerStarted","Data":"f68bdc668a813c540236befb4784aad91de3078d0151efad3d06dccd992c20d8"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.442508 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.443474 4992 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-45qvp container/catalog-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.443509 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" podUID="9398e230-5d76-4418-9807-be17513913c0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.444092 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" event={"ID":"db3860c3-37de-4fa5-9c79-965abd0e2149","Type":"ContainerStarted","Data":"be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.445493 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.445717 4992 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vktdq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.445752 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.448286 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-2m8dn" podStartSLOduration=123.448272113 podStartE2EDuration="2m3.448272113s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.446217742 +0000 UTC m=+146.417609729" watchObservedRunningTime="2026-01-31 09:27:30.448272113 +0000 UTC m=+146.419664100" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.452117 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" event={"ID":"3f216997-1de4-499d-b5f2-0bacbbdbdd36","Type":"ContainerStarted","Data":"680a4e9d9d0d434b2f8a9bea25abb74955c029f5aafe35d6ac059229e202aa5e"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.454261 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" event={"ID":"65c59658-5ed8-4cef-b36d-2a1e44ec6976","Type":"ContainerStarted","Data":"a5f07f12e53bc482271dbac8b5d3aec9e8654c73d41a2db7e4becbb382281eea"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.461978 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7bjlw" event={"ID":"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6","Type":"ContainerStarted","Data":"7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.463458 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-np7kv" podStartSLOduration=6.463439257 podStartE2EDuration="6.463439257s" podCreationTimestamp="2026-01-31 09:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.461069218 +0000 UTC m=+146.432461225" watchObservedRunningTime="2026-01-31 
09:27:30.463439257 +0000 UTC m=+146.434831244" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.466775 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" event={"ID":"b7269408-3cf0-468a-a5d4-2625ff71b408","Type":"ContainerStarted","Data":"f09bc67bb8e391ca1b1271a1062b1b3f3f89c0038ff0f2674109d95c46ffcd57"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.468109 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" event={"ID":"58e3fff7-b9d0-4107-96ae-a0a00965f574","Type":"ContainerStarted","Data":"f497e1133fad8b0196591ad6fdff6bd3825b8b8dd5b9aa4726413adf34ea44f0"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.474501 4992 generic.go:334] "Generic (PLEG): container finished" podID="7d272437-fe00-4ec0-ba5b-35e25d59ccf2" containerID="2c7c557351d13c92536ef7b6d119cc384e2042c147002fcd317ea8a1beccbddc" exitCode=0 Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.474968 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" event={"ID":"7d272437-fe00-4ec0-ba5b-35e25d59ccf2","Type":"ContainerDied","Data":"2c7c557351d13c92536ef7b6d119cc384e2042c147002fcd317ea8a1beccbddc"} Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477816 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477843 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477902 4992 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-srdfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 
10.217.0.8:8443: connect: connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477929 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477935 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477959 4992 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bcggv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477989 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.477965 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:27:30 crc kubenswrapper[4992]: 
I0131 09:27:30.497032 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.497895 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:30.997880007 +0000 UTC m=+146.969271994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.501361 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-v7r7n" podStartSLOduration=123.501348348 podStartE2EDuration="2m3.501348348s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.500772271 +0000 UTC m=+146.472164278" watchObservedRunningTime="2026-01-31 09:27:30.501348348 +0000 UTC m=+146.472740335" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.501891 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-server-9hqbq" podStartSLOduration=6.501885364 podStartE2EDuration="6.501885364s" podCreationTimestamp="2026-01-31 09:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.484940807 +0000 UTC m=+146.456332794" watchObservedRunningTime="2026-01-31 09:27:30.501885364 +0000 UTC m=+146.473277351" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.541021 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-v7m78" podStartSLOduration=123.541006131 podStartE2EDuration="2m3.541006131s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.538436395 +0000 UTC m=+146.509828382" watchObservedRunningTime="2026-01-31 09:27:30.541006131 +0000 UTC m=+146.512398118" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.541430 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lmbqt" podStartSLOduration=123.541408732 podStartE2EDuration="2m3.541408732s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.521767957 +0000 UTC m=+146.493159964" watchObservedRunningTime="2026-01-31 09:27:30.541408732 +0000 UTC m=+146.512800719" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.556175 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podStartSLOduration=123.556159225 podStartE2EDuration="2m3.556159225s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.554845776 +0000 UTC m=+146.526237773" watchObservedRunningTime="2026-01-31 09:27:30.556159225 +0000 UTC m=+146.527551212" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.578950 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" podStartSLOduration=123.578930542 podStartE2EDuration="2m3.578930542s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.576290385 +0000 UTC m=+146.547682392" watchObservedRunningTime="2026-01-31 09:27:30.578930542 +0000 UTC m=+146.550322549" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.599125 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.599583 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.099564887 +0000 UTC m=+147.070956874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.601869 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.608472 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.108450278 +0000 UTC m=+147.079842345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.651299 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jgrjj" podStartSLOduration=123.651283403 podStartE2EDuration="2m3.651283403s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.624288482 +0000 UTC m=+146.595680479" watchObservedRunningTime="2026-01-31 09:27:30.651283403 +0000 UTC m=+146.622675390" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.651879 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7bjlw" podStartSLOduration=123.65187447 podStartE2EDuration="2m3.65187447s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.651679905 +0000 UTC m=+146.623071912" watchObservedRunningTime="2026-01-31 09:27:30.65187447 +0000 UTC m=+146.623266457" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.684036 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jr8pm" podStartSLOduration=123.684012052 podStartE2EDuration="2m3.684012052s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.67951458 +0000 UTC m=+146.650906587" watchObservedRunningTime="2026-01-31 09:27:30.684012052 +0000 UTC m=+146.655404049" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.704901 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.706501 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.20646249 +0000 UTC m=+147.177854477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.708737 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zvnpr" podStartSLOduration=125.708700516 podStartE2EDuration="2m5.708700516s" podCreationTimestamp="2026-01-31 09:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.702879005 +0000 UTC m=+146.674270992" watchObservedRunningTime="2026-01-31 09:27:30.708700516 +0000 UTC m=+146.680092523" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.768114 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" podStartSLOduration=123.768080116 podStartE2EDuration="2m3.768080116s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.76071297 +0000 UTC m=+146.732104977" watchObservedRunningTime="2026-01-31 09:27:30.768080116 +0000 UTC m=+146.739472103" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.794180 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bzb6n" podStartSLOduration=123.794153911 podStartE2EDuration="2m3.794153911s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:30.791047 +0000 UTC m=+146.762438987" watchObservedRunningTime="2026-01-31 09:27:30.794153911 +0000 UTC m=+146.765545918" Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.806357 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.806726 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.306713119 +0000 UTC m=+147.278105106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.906904 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:30 crc kubenswrapper[4992]: E0131 09:27:30.907668 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.407653217 +0000 UTC m=+147.379045204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.940998 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 31 09:27:30 crc kubenswrapper[4992]: I0131 09:27:30.941063 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.008795 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.009264 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.509243935 +0000 UTC m=+147.480635922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.109597 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.109725 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.60969917 +0000 UTC m=+147.581091157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.109807 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.110122 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.610115152 +0000 UTC m=+147.581507139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.210489 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.210589 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.710571106 +0000 UTC m=+147.681963093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.210708 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.211067 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.71105593 +0000 UTC m=+147.682447917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.312302 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.312587 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.812557516 +0000 UTC m=+147.783949503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.312840 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.313174 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.813156103 +0000 UTC m=+147.784548090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.413817 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.414017 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:31.913989059 +0000 UTC m=+147.885381066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.482281 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" event={"ID":"b7269408-3cf0-468a-a5d4-2625ff71b408","Type":"ContainerStarted","Data":"012fb7d6d54b365a62539ecc0f5bc93b5867edf7751cf69f152f4835ba88127a"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.485843 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" event={"ID":"fbfb9fcb-64ca-4640-a230-4212004d2494","Type":"ContainerStarted","Data":"7c6a5aa752573565c4cf85bfc9f8d36cee2c9069ba3c3f1fbfef06b64082c986"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.487138 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.488465 4992 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kc7bw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.488524 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" podUID="fbfb9fcb-64ca-4640-a230-4212004d2494" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.489320 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" event={"ID":"90220a85-3f5b-4360-9cc9-d9c9a65db928","Type":"ContainerStarted","Data":"e3b479cb692b29334196745d60f18f8ce151a2864d339f12af206aa8bb031616"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.489687 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.490980 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" event={"ID":"42351b07-cf74-49fd-b6fd-88b7ef8fdac0","Type":"ContainerStarted","Data":"b768c51c7370bf0a5e2863e9d4fe57a03fab7b1bcfb19617106bd81eb457d9f1"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.491486 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.493119 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" event={"ID":"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4","Type":"ContainerStarted","Data":"92c2cd3ad477d9cfb842e97128f7ffe9153c1ceab17b1ccce2ffcb7c92f3cca8"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.493320 4992 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r68rm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.493369 4992 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.494526 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" event={"ID":"4f8fe86e-1b21-4358-aa22-4c0939d313f7","Type":"ContainerStarted","Data":"e709ebd47c5db57e10105f02acbfe22bdf30b84e6aa9992912fed6f88bc41b7f"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.497142 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" event={"ID":"a364be49-d097-438f-858d-77e2bcff5ad0","Type":"ContainerStarted","Data":"be9afd7bc1cbb374516174b75ba5844c62dcbbbfc263939485e24b9cf0e12e2d"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.500028 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" event={"ID":"46b9e525-ef69-4600-92c7-8eb418824669","Type":"ContainerStarted","Data":"71e3d1b1dbca57135b9d4ed650059d371091d895cfdc22be2cde4b8c80a922a5"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.503328 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" event={"ID":"7d272437-fe00-4ec0-ba5b-35e25d59ccf2","Type":"ContainerStarted","Data":"0956f5e3deb0335b1169fa4b3b3b93bac1404326c6ea860056b610815e73d55c"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.505355 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2bwrf" event={"ID":"c1993d67-8009-4a0c-90b2-517e1504bc2a","Type":"ContainerStarted","Data":"e059bd0ada31b39ec9c984175f3593d4d9ae7e2e579852eda92b34b6873b2490"} Jan 
31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.505389 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2bwrf" event={"ID":"c1993d67-8009-4a0c-90b2-517e1504bc2a","Type":"ContainerStarted","Data":"999fd7eab22d7456145b84997e4164459bcf488fadb27236a74b45ceac84cecc"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.506307 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.508912 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" event={"ID":"fd0c1366-dfbf-487e-98a4-94fb4be75045","Type":"ContainerStarted","Data":"dfef1b4c8de530fe171338c97a5d7c35e18f29ed8f015f224bd136485c3d6c64"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.509947 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.514449 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" event={"ID":"6e0e4407-bfda-4d16-9e9e-d9065286a07d","Type":"ContainerStarted","Data":"baba940fa7b8ba3557d06ca9fa02bcd38ee8ed2fc4f1ad76a2513e072489d0c8"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.515878 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.516500 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.016479193 +0000 UTC m=+147.987871180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.528335 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" event={"ID":"3f216997-1de4-499d-b5f2-0bacbbdbdd36","Type":"ContainerStarted","Data":"cacc0ea1069e68873edd050ef79c99e24f197c254b03134b8b89657ee5e8a7c5"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.536811 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" event={"ID":"7425a945-4499-4a87-b745-d31e5dbf9d0e","Type":"ContainerStarted","Data":"b206e578f450c9a6fc65a6b17a4c3187e9901afd4d8aef5022df2787ea536afe"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.538888 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" event={"ID":"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35","Type":"ContainerStarted","Data":"27fb6cb3d4f1ed2e9d6b7f3a057a2a0bbdb1860053635e5658f2f35f82b595a2"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.538935 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" 
event={"ID":"0a88cb09-555e-4fd5-9ffc-fcff02f2bf35","Type":"ContainerStarted","Data":"2b0db0a8677a16c8741a8a507494068cba25867870aac8e57310e740f0a2e19a"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.544446 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" event={"ID":"f4ab691e-904d-49b1-9b3e-57a8271bd791","Type":"ContainerStarted","Data":"67ce3cbe8a0838c7a13f0d68df268b0fbdcd1b1fd38d37a0f9079e0dd6cbbbe6"} Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.555689 4992 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-srdfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.555754 4992 patch_prober.go:28] interesting pod/console-operator-58897d9998-6nssv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.555872 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6nssv" podUID="3bfb568a-6ad7-41fd-86da-7aaf96ecd991" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.555764 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection 
refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.555689 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.555971 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.556822 4992 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-45qvp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.556852 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" podUID="9398e230-5d76-4418-9807-be17513913c0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.557093 4992 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vktdq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.557120 4992 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.557557 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.561581 4992 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-25wn6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.561661 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" podUID="82abaea9-9d21-432e-a434-d21fe6a7197b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.586738 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.586810 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.588135 4992 patch_prober.go:28] interesting pod/apiserver-76f77b778f-68zwk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 
09:27:31.588204 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" podUID="6e0e4407-bfda-4d16-9e9e-d9065286a07d" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.590782 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" podStartSLOduration=124.59075712 podStartE2EDuration="2m4.59075712s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.589690599 +0000 UTC m=+147.561082596" watchObservedRunningTime="2026-01-31 09:27:31.59075712 +0000 UTC m=+147.562149107" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.593947 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxddh" podStartSLOduration=124.593915403 podStartE2EDuration="2m4.593915403s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.518404959 +0000 UTC m=+147.489796976" watchObservedRunningTime="2026-01-31 09:27:31.593915403 +0000 UTC m=+147.565307390" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.617042 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.617223 4992 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.117196255 +0000 UTC m=+148.088588242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.617339 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.617773 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.117751801 +0000 UTC m=+148.089143778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.636363 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" podStartSLOduration=124.636343296 podStartE2EDuration="2m4.636343296s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.634177203 +0000 UTC m=+147.605569190" watchObservedRunningTime="2026-01-31 09:27:31.636343296 +0000 UTC m=+147.607735283" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.665363 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dntz7" podStartSLOduration=124.665344896 podStartE2EDuration="2m4.665344896s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.663974556 +0000 UTC m=+147.635366563" watchObservedRunningTime="2026-01-31 09:27:31.665344896 +0000 UTC m=+147.636736883" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.684080 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" podStartSLOduration=124.684049895 podStartE2EDuration="2m4.684049895s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.681109908 +0000 UTC m=+147.652501925" watchObservedRunningTime="2026-01-31 09:27:31.684049895 +0000 UTC m=+147.655441892" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.716165 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8" podStartSLOduration=124.716144435 podStartE2EDuration="2m4.716144435s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.714292321 +0000 UTC m=+147.685684308" watchObservedRunningTime="2026-01-31 09:27:31.716144435 +0000 UTC m=+147.687536422" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.721136 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.721311 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.221282876 +0000 UTC m=+148.192674863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.726716 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.730752 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.230735793 +0000 UTC m=+148.202127780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.759611 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-45rxt" podStartSLOduration=124.759584228 podStartE2EDuration="2m4.759584228s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.749767081 +0000 UTC m=+147.721159098" watchObservedRunningTime="2026-01-31 09:27:31.759584228 +0000 UTC m=+147.730976215" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.768067 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" podStartSLOduration=124.768047657 podStartE2EDuration="2m4.768047657s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.765476431 +0000 UTC m=+147.736868438" watchObservedRunningTime="2026-01-31 09:27:31.768047657 +0000 UTC m=+147.739439644" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.784290 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7kkdk" podStartSLOduration=124.784269302 podStartE2EDuration="2m4.784269302s" podCreationTimestamp="2026-01-31 
09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.781362827 +0000 UTC m=+147.752754834" watchObservedRunningTime="2026-01-31 09:27:31.784269302 +0000 UTC m=+147.755661289" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.803504 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" podStartSLOduration=124.803488325 podStartE2EDuration="2m4.803488325s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.801622901 +0000 UTC m=+147.773014898" watchObservedRunningTime="2026-01-31 09:27:31.803488325 +0000 UTC m=+147.774880312" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.828959 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.829567 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.329548209 +0000 UTC m=+148.300940196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.829628 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2bwrf" podStartSLOduration=7.829613221 podStartE2EDuration="7.829613221s" podCreationTimestamp="2026-01-31 09:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.823473711 +0000 UTC m=+147.794865718" watchObservedRunningTime="2026-01-31 09:27:31.829613221 +0000 UTC m=+147.801005238" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.846940 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" podStartSLOduration=124.846923099 podStartE2EDuration="2m4.846923099s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.845546348 +0000 UTC m=+147.816938355" watchObservedRunningTime="2026-01-31 09:27:31.846923099 +0000 UTC m=+147.818315096" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.875675 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-tq5sk" podStartSLOduration=124.875652961 podStartE2EDuration="2m4.875652961s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.875486926 +0000 UTC m=+147.846878913" watchObservedRunningTime="2026-01-31 09:27:31.875652961 +0000 UTC m=+147.847044948" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.896349 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.896388 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.897789 4992 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2fk8c container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.898100 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" podUID="7d272437-fe00-4ec0-ba5b-35e25d59ccf2" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.926536 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-glkns" podStartSLOduration=124.926516871 podStartE2EDuration="2m4.926516871s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.924303267 +0000 UTC m=+147.895695274" watchObservedRunningTime="2026-01-31 09:27:31.926516871 +0000 UTC m=+147.897908858" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.927871 4992 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5564v" podStartSLOduration=124.927863041 podStartE2EDuration="2m4.927863041s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.897973175 +0000 UTC m=+147.869365182" watchObservedRunningTime="2026-01-31 09:27:31.927863041 +0000 UTC m=+147.899255028" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.930226 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:31 crc kubenswrapper[4992]: E0131 09:27:31.930704 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.430689234 +0000 UTC m=+148.402081221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.948210 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:31 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:31 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:31 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.948275 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.979651 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ddxxb" podStartSLOduration=124.979632087 podStartE2EDuration="2m4.979632087s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.959955091 +0000 UTC m=+147.931347108" watchObservedRunningTime="2026-01-31 09:27:31.979632087 +0000 UTC m=+147.951024074" Jan 31 09:27:31 crc kubenswrapper[4992]: I0131 09:27:31.979805 4992 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8r6k8" podStartSLOduration=124.979800992 podStartE2EDuration="2m4.979800992s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:31.976034562 +0000 UTC m=+147.947426559" watchObservedRunningTime="2026-01-31 09:27:31.979800992 +0000 UTC m=+147.951192979" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.019089 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" podStartSLOduration=126.019070303 podStartE2EDuration="2m6.019070303s" podCreationTimestamp="2026-01-31 09:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:32.01794163 +0000 UTC m=+147.989333627" watchObservedRunningTime="2026-01-31 09:27:32.019070303 +0000 UTC m=+147.990462290" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.031535 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.032346 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.532320502 +0000 UTC m=+148.503712489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.038961 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" podStartSLOduration=125.038941466 podStartE2EDuration="2m5.038941466s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:32.036735121 +0000 UTC m=+148.008127118" watchObservedRunningTime="2026-01-31 09:27:32.038941466 +0000 UTC m=+148.010333463" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.133617 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.134144 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.634123426 +0000 UTC m=+148.605515413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.234139 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.234304 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.734272781 +0000 UTC m=+148.705664768 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.234709 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.235031 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.735020573 +0000 UTC m=+148.706412560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.335385 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.335552 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.835525989 +0000 UTC m=+148.806917976 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.335603 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.336052 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.836036834 +0000 UTC m=+148.807428831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.436281 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.436664 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:32.936617272 +0000 UTC m=+148.908009249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.537323 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.537724 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.037709255 +0000 UTC m=+149.009101242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.560219 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" event={"ID":"24d2a857-eb20-4eb7-acb2-077e53af8b03","Type":"ContainerStarted","Data":"155b2c53eac2917418211f233a7d38f91439ee41994101d7e1d19cfd99a7b6a4"} Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.561298 4992 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-r68rm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.561350 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.561698 4992 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-25wn6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.561767 4992 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" podUID="82abaea9-9d21-432e-a434-d21fe6a7197b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.561911 4992 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kc7bw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.561937 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" podUID="fbfb9fcb-64ca-4640-a230-4212004d2494" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.562163 4992 patch_prober.go:28] interesting pod/console-operator-58897d9998-6nssv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.562213 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6nssv" podUID="3bfb568a-6ad7-41fd-86da-7aaf96ecd991" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.581043 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-45qvp" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.611360 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h8jxh" podStartSLOduration=125.611339263 podStartE2EDuration="2m5.611339263s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:32.067187364 +0000 UTC m=+148.038579371" watchObservedRunningTime="2026-01-31 09:27:32.611339263 +0000 UTC m=+148.582731250" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.638730 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.639459 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.139436807 +0000 UTC m=+149.110828794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.740319 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.740802 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.240781848 +0000 UTC m=+149.212173895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.849468 4992 csr.go:261] certificate signing request csr-hv2bq is approved, waiting to be issued Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.850799 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.851217 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.351200804 +0000 UTC m=+149.322592791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.859649 4992 csr.go:257] certificate signing request csr-hv2bq is issued Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.946468 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:32 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:32 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:32 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.946562 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:32 crc kubenswrapper[4992]: I0131 09:27:32.951949 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:32 crc kubenswrapper[4992]: E0131 09:27:32.952353 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.452326958 +0000 UTC m=+149.423718945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.052826 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.053050 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.55301875 +0000 UTC m=+149.524410747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.054171 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.054589 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.554573005 +0000 UTC m=+149.525964992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.157011 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.157715 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.657696138 +0000 UTC m=+149.629088125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.258149 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.258727 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.758708059 +0000 UTC m=+149.730100046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.359529 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.359983 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.859952586 +0000 UTC m=+149.831344583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.460718 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.460775 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.460808 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.460854 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.460878 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.461620 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:33.961580855 +0000 UTC m=+149.932972842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.462448 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.467375 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.467694 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.468189 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.505700 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.522676 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.543743 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.563166 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.563882 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.063860673 +0000 UTC m=+150.035252660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.572799 4992 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vktdq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.572896 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.668864 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.675478 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 09:27:34.175396192 +0000 UTC m=+150.146788179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.772063 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.772871 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.272851469 +0000 UTC m=+150.244243446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.861489 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-31 09:22:32 +0000 UTC, rotation deadline is 2026-12-13 21:36:04.985996456 +0000 UTC Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.861535 4992 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7596h8m31.124465308s for next certificate rotation Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.873923 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.874364 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.374344824 +0000 UTC m=+150.345736871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.955613 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:33 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:33 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:33 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.955677 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.976856 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.977008 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:27:34.476984852 +0000 UTC m=+150.448376839 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:33 crc kubenswrapper[4992]: I0131 09:27:33.977114 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:33 crc kubenswrapper[4992]: E0131 09:27:33.977464 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.477453506 +0000 UTC m=+150.448845493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.077950 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.078178 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.578161828 +0000 UTC m=+150.549553815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.178798 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.179120 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.679107847 +0000 UTC m=+150.650499834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.279707 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.279917 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.77988736 +0000 UTC m=+150.751279347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.279969 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.280246 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.780234241 +0000 UTC m=+150.751626228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.359398 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4drj6"] Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.360314 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.368688 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.377972 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4drj6"] Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.380393 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.380534 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.88051191 +0000 UTC m=+150.851903897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.380598 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-catalog-content\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.380621 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-utilities\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.380641 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9nwc\" (UniqueName: \"kubernetes.io/projected/7c4d6b90-976c-46f4-b55f-26d3277cc754-kube-api-access-k9nwc\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.380680 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.380925 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.880914132 +0000 UTC m=+150.852306119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.481774 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.482171 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:34.982111658 +0000 UTC m=+150.953503645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.482323 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-utilities\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.482408 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-catalog-content\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.482503 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9nwc\" (UniqueName: \"kubernetes.io/projected/7c4d6b90-976c-46f4-b55f-26d3277cc754-kube-api-access-k9nwc\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.483183 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-utilities\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " 
pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.483486 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-catalog-content\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.518120 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9nwc\" (UniqueName: \"kubernetes.io/projected/7c4d6b90-976c-46f4-b55f-26d3277cc754-kube-api-access-k9nwc\") pod \"certified-operators-4drj6\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.548316 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b7j8l"] Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.549530 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.556971 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.581527 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7j8l"] Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.581805 4992 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kc7bw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.586457 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" podUID="fbfb9fcb-64ca-4640-a230-4212004d2494" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.586380 4992 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-z2gbk container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.586673 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" podUID="fd0c1366-dfbf-487e-98a4-94fb4be75045" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.588301 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-utilities\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.588442 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-catalog-content\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.588540 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb5wd\" (UniqueName: \"kubernetes.io/projected/533c10ab-faa7-4a62-8e8a-2ebd87578ced-kube-api-access-rb5wd\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.588631 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.588935 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.088924109 +0000 UTC m=+151.060316096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.621592 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f738e73aa661b37e49e72a5b1b618f0ebc8571454aa3654c05c4e48164a327c5"} Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.640604 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7b5ce202a43271698016489805347cb62757c417128d176829b388fc77cf8076"} Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.652837 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d590e5f9adc23474f7df5021ba4eb082001d965839b709590123362be786b072"} Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.688996 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.689120 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.689191 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.189172967 +0000 UTC m=+151.160564954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.690329 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb5wd\" (UniqueName: \"kubernetes.io/projected/533c10ab-faa7-4a62-8e8a-2ebd87578ced-kube-api-access-rb5wd\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.690517 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.690724 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-utilities\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.690829 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-catalog-content\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.691205 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.191185786 +0000 UTC m=+151.162577803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.691495 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-catalog-content\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.692019 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-utilities\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.733148 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb5wd\" (UniqueName: \"kubernetes.io/projected/533c10ab-faa7-4a62-8e8a-2ebd87578ced-kube-api-access-rb5wd\") pod \"community-operators-b7j8l\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.745954 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j9mrh"] Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.759324 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.801347 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.801924 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.301904641 +0000 UTC m=+151.273296628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.817700 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9mrh"] Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.877968 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.915220 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmrx\" (UniqueName: \"kubernetes.io/projected/0377ed6d-ea6e-44cb-9d09-0c817af64b22-kube-api-access-ctmrx\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.915271 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.915469 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-utilities\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:34 crc kubenswrapper[4992]: E0131 09:27:34.915545 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.415533182 +0000 UTC m=+151.386925169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.915610 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-catalog-content\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.946275 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:34 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:34 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:34 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:34 crc kubenswrapper[4992]: I0131 09:27:34.946356 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.021209 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.021859 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-catalog-content\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.021891 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctmrx\" (UniqueName: \"kubernetes.io/projected/0377ed6d-ea6e-44cb-9d09-0c817af64b22-kube-api-access-ctmrx\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.021979 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-utilities\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.022455 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-utilities\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.022540 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.522522268 +0000 UTC m=+151.493914255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.022815 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-catalog-content\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.024567 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rmbpb"] Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.025567 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.041566 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.042357 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.054022 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.054265 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.096614 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmbpb"] Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.097529 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctmrx\" (UniqueName: \"kubernetes.io/projected/0377ed6d-ea6e-44cb-9d09-0c817af64b22-kube-api-access-ctmrx\") pod \"certified-operators-j9mrh\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.119389 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.122823 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.122991 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1688efda-4764-4e8b-b5cf-5544ef6edad8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.123108 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1688efda-4764-4e8b-b5cf-5544ef6edad8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.123281 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-catalog-content\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.123436 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-utilities\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.123567 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vmvw\" (UniqueName: \"kubernetes.io/projected/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-kube-api-access-5vmvw\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.124034 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.624018443 +0000 UTC m=+151.595410430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.130731 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.226901 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.227183 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-utilities\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.227221 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vmvw\" (UniqueName: \"kubernetes.io/projected/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-kube-api-access-5vmvw\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.227293 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1688efda-4764-4e8b-b5cf-5544ef6edad8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.227317 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1688efda-4764-4e8b-b5cf-5544ef6edad8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.227369 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-catalog-content\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.228308 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-catalog-content\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.228640 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-utilities\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.229133 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.729112753 +0000 UTC m=+151.700504740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.229182 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1688efda-4764-4e8b-b5cf-5544ef6edad8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.292242 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1688efda-4764-4e8b-b5cf-5544ef6edad8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.303346 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vmvw\" (UniqueName: \"kubernetes.io/projected/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-kube-api-access-5vmvw\") pod \"community-operators-rmbpb\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.335137 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.335519 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.835501262 +0000 UTC m=+151.806893249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.376961 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.444470 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.444891 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:35.944851497 +0000 UTC m=+151.916243484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.447941 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.533894 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z2gbk" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.549638 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.551051 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.051036118 +0000 UTC m=+152.022428105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.570799 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4drj6"] Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.650720 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.651587 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.151566645 +0000 UTC m=+152.122958632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.714166 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e7e723ba38dfcf74bffe624e0e0923cc3ea6cce15451fedffe11675792d6b0fa"} Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.714439 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.730230 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3c005937950e4bb12366c82969c069fe3d33febcfac0bdfeacfe31fb8d6899eb"} Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.753333 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.754191 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.254167362 +0000 UTC m=+152.225559399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.754762 4992 generic.go:334] "Generic (PLEG): container finished" podID="65c59658-5ed8-4cef-b36d-2a1e44ec6976" containerID="a5f07f12e53bc482271dbac8b5d3aec9e8654c73d41a2db7e4becbb382281eea" exitCode=0 Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.754829 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" event={"ID":"65c59658-5ed8-4cef-b36d-2a1e44ec6976","Type":"ContainerDied","Data":"a5f07f12e53bc482271dbac8b5d3aec9e8654c73d41a2db7e4becbb382281eea"} Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.821555 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7943ece7849c5cd00b865c8276ed8fd95660b642ff4602cab78baab7596903f2"} Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.857056 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.857561 
4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.357540312 +0000 UTC m=+152.328932299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.878456 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7j8l"] Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.949161 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:35 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:35 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:35 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.949590 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:35 crc kubenswrapper[4992]: I0131 09:27:35.959879 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:35 crc kubenswrapper[4992]: E0131 09:27:35.960234 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.460221302 +0000 UTC m=+152.431613289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.061268 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.061634 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.561614384 +0000 UTC m=+152.533006381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.077286 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j9mrh"] Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.163486 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.164145 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.664129639 +0000 UTC m=+152.635521626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.179604 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.266211 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.266387 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.766363145 +0000 UTC m=+152.737755132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.266508 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.266854 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.766836679 +0000 UTC m=+152.738228666 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.367274 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.367727 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.867677505 +0000 UTC m=+152.839069502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.387761 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rmbpb"] Jan 31 09:27:36 crc kubenswrapper[4992]: W0131 09:27:36.395793 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aab422a_915a_4fd8_a9f2_3f04bdaee9da.slice/crio-844bc7a9bb1ad388abca2faa9adb6f39f8f2b97274519ed7599b855bddf0557e WatchSource:0}: Error finding container 844bc7a9bb1ad388abca2faa9adb6f39f8f2b97274519ed7599b855bddf0557e: Status 404 returned error can't find the container with id 844bc7a9bb1ad388abca2faa9adb6f39f8f2b97274519ed7599b855bddf0557e Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.420840 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.468965 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.471115 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:36.971096926 +0000 UTC m=+152.942488913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.570095 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.570522 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.070505 +0000 UTC m=+153.041896987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.597873 4992 patch_prober.go:28] interesting pod/apiserver-76f77b778f-68zwk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]log ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]etcd ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/max-in-flight-filter ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 31 09:27:36 crc kubenswrapper[4992]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 31 09:27:36 crc kubenswrapper[4992]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/project.openshift.io-projectcache ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/openshift.io-startinformers ok Jan 31 09:27:36 crc kubenswrapper[4992]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 31 09:27:36 crc 
kubenswrapper[4992]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 09:27:36 crc kubenswrapper[4992]: livez check failed Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.597964 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" podUID="6e0e4407-bfda-4d16-9e9e-d9065286a07d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.671761 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.672114 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.172098008 +0000 UTC m=+153.143489995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.722913 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2qtkp"] Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.724196 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.729354 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.755456 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qtkp"] Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.769978 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.772590 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.772973 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.272952084 +0000 UTC m=+153.244344081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.773245 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.773516 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.27350764 +0000 UTC m=+153.244899627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.840245 4992 generic.go:334] "Generic (PLEG): container finished" podID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerID="857fe9d991ab2b833c2200c5b86fc8fa696ba81321cc86eda3e69d43f43e688b" exitCode=0 Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.840880 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drj6" event={"ID":"7c4d6b90-976c-46f4-b55f-26d3277cc754","Type":"ContainerDied","Data":"857fe9d991ab2b833c2200c5b86fc8fa696ba81321cc86eda3e69d43f43e688b"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.840940 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drj6" event={"ID":"7c4d6b90-976c-46f4-b55f-26d3277cc754","Type":"ContainerStarted","Data":"9c80b324340349f5b257c78500fc07a5541884ace517d1a85b26be695b5aabdb"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.843319 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.856671 4992 generic.go:334] "Generic (PLEG): container finished" podID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerID="ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4" exitCode=0 Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.856788 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7j8l" 
event={"ID":"533c10ab-faa7-4a62-8e8a-2ebd87578ced","Type":"ContainerDied","Data":"ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.856845 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7j8l" event={"ID":"533c10ab-faa7-4a62-8e8a-2ebd87578ced","Type":"ContainerStarted","Data":"6e9cb256a735d92bc735ce08030d7d9f34ebeb99e0532a0b78364f70d3846e60"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.862340 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.862513 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.863045 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmbpb" event={"ID":"4aab422a-915a-4fd8-a9f2-3f04bdaee9da","Type":"ContainerStarted","Data":"844bc7a9bb1ad388abca2faa9adb6f39f8f2b97274519ed7599b855bddf0557e"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.864635 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.864704 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jgrjj" 
podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.868958 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" event={"ID":"24d2a857-eb20-4eb7-acb2-077e53af8b03","Type":"ContainerStarted","Data":"700b9b211fa32eb2b0eed77024392246cf5dd55a686fe5190781b68eff688d33"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.876437 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.876992 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-utilities\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.877080 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6ql\" (UniqueName: \"kubernetes.io/projected/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-kube-api-access-np6ql\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.877236 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-catalog-content\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.877396 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.377376125 +0000 UTC m=+153.348768112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.892855 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1688efda-4764-4e8b-b5cf-5544ef6edad8","Type":"ContainerStarted","Data":"df2213682fae6df44c971284e3d852a36afbbeb048583189c8af631d2a6bb8cc"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.892915 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1688efda-4764-4e8b-b5cf-5544ef6edad8","Type":"ContainerStarted","Data":"cfd918e31b9435feffa2c2ce58426767344b384c8a967c5af4f0897c2a89e216"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.925073 4992 generic.go:334] "Generic (PLEG): container finished" podID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerID="4ae68a2eae1d3a32859991118e6bc242e7005a525e112e1b7d55e956a7a03051" exitCode=0 Jan 31 09:27:36 
crc kubenswrapper[4992]: I0131 09:27:36.925667 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9mrh" event={"ID":"0377ed6d-ea6e-44cb-9d09-0c817af64b22","Type":"ContainerDied","Data":"4ae68a2eae1d3a32859991118e6bc242e7005a525e112e1b7d55e956a7a03051"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.925755 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9mrh" event={"ID":"0377ed6d-ea6e-44cb-9d09-0c817af64b22","Type":"ContainerStarted","Data":"26250c8d497c57eb4891e22fe2d61a35a354952bf3aac1f5cebc5a9af161fe34"} Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.935450 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.943153 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8vlmm" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.948224 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:36 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:36 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:36 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.948597 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.958527 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2fk8c" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.963615 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.969525 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.969501635 podStartE2EDuration="2.969501635s" podCreationTimestamp="2026-01-31 09:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:36.935021255 +0000 UTC m=+152.906413262" watchObservedRunningTime="2026-01-31 09:27:36.969501635 +0000 UTC m=+152.940893632" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.979949 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-catalog-content\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.980141 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-utilities\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.980195 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: 
\"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.980246 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6ql\" (UniqueName: \"kubernetes.io/projected/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-kube-api-access-np6ql\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.981665 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-utilities\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:36 crc kubenswrapper[4992]: E0131 09:27:36.982635 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.48262204 +0000 UTC m=+153.454014027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:36 crc kubenswrapper[4992]: I0131 09:27:36.994541 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-catalog-content\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.033879 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6ql\" (UniqueName: \"kubernetes.io/projected/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-kube-api-access-np6ql\") pod \"redhat-marketplace-2qtkp\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.068303 4992 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.073290 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.081505 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.081873 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.581852658 +0000 UTC m=+153.553244645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.142582 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.143773 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.161631 4992 patch_prober.go:28] interesting pod/console-f9d7485db-7bjlw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" 
start-of-body= Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.161685 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7bjlw" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.164197 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6nssv" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.186313 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.186739 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.686721752 +0000 UTC m=+153.658113739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.210554 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2b45w"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.211896 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.227710 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b45w"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.289570 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.289892 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.789864295 +0000 UTC m=+153.761256292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.289928 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.291639 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.791626337 +0000 UTC m=+153.763018324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.365511 4992 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-31T09:27:37.068330752Z","Handler":null,"Name":""} Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.394007 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.394301 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-catalog-content\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.394347 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-utilities\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc 
kubenswrapper[4992]: I0131 09:27:37.394377 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr2jx\" (UniqueName: \"kubernetes.io/projected/1cd981c2-7781-4f04-8ef1-73219837a007-kube-api-access-zr2jx\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.394927 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.894908114 +0000 UTC m=+153.866300101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.424955 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.431977 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kc7bw" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.443727 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-25wn6" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.496177 4992 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zr2jx\" (UniqueName: \"kubernetes.io/projected/1cd981c2-7781-4f04-8ef1-73219837a007-kube-api-access-zr2jx\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.496258 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.496358 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-catalog-content\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.496386 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-utilities\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.496871 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-utilities\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.497587 4992 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:27:37.997572384 +0000 UTC m=+153.968964371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-j6dj7" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.498013 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-catalog-content\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.562908 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.572467 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9xmbj"] Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.572728 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c59658-5ed8-4cef-b36d-2a1e44ec6976" containerName="collect-profiles" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.572743 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c59658-5ed8-4cef-b36d-2a1e44ec6976" containerName="collect-profiles" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.572851 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c59658-5ed8-4cef-b36d-2a1e44ec6976" containerName="collect-profiles" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.573671 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.581401 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr2jx\" (UniqueName: \"kubernetes.io/projected/1cd981c2-7781-4f04-8ef1-73219837a007-kube-api-access-zr2jx\") pod \"redhat-marketplace-2b45w\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.599116 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.601107 4992 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.601257 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.602028 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: E0131 09:27:37.602381 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:27:38.102363255 +0000 UTC m=+154.073755242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.602740 4992 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.602773 4992 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.604044 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 
09:27:37.606397 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.629999 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.666901 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xmbj"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.714505 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c59658-5ed8-4cef-b36d-2a1e44ec6976-config-volume\") pod \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.714762 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65c59658-5ed8-4cef-b36d-2a1e44ec6976-secret-volume\") pod \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.714841 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhnsb\" (UniqueName: \"kubernetes.io/projected/65c59658-5ed8-4cef-b36d-2a1e44ec6976-kube-api-access-rhnsb\") pod \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\" (UID: \"65c59658-5ed8-4cef-b36d-2a1e44ec6976\") " Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.715260 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b45f463b-e11b-4b00-b459-4c401197e9c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: 
I0131 09:27:37.715639 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45f463b-e11b-4b00-b459-4c401197e9c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.715944 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-utilities\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.716400 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdpxs\" (UniqueName: \"kubernetes.io/projected/de317207-ef37-4247-9d1d-279570141ebc-kube-api-access-fdpxs\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.717969 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c59658-5ed8-4cef-b36d-2a1e44ec6976-config-volume" (OuterVolumeSpecName: "config-volume") pod "65c59658-5ed8-4cef-b36d-2a1e44ec6976" (UID: "65c59658-5ed8-4cef-b36d-2a1e44ec6976"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.720259 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.720343 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-catalog-content\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.735185 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c59658-5ed8-4cef-b36d-2a1e44ec6976-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.758451 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pt48t"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.759941 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pt48t"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.760028 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.785508 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c59658-5ed8-4cef-b36d-2a1e44ec6976-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "65c59658-5ed8-4cef-b36d-2a1e44ec6976" (UID: "65c59658-5ed8-4cef-b36d-2a1e44ec6976"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.794190 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c59658-5ed8-4cef-b36d-2a1e44ec6976-kube-api-access-rhnsb" (OuterVolumeSpecName: "kube-api-access-rhnsb") pod "65c59658-5ed8-4cef-b36d-2a1e44ec6976" (UID: "65c59658-5ed8-4cef-b36d-2a1e44ec6976"). InnerVolumeSpecName "kube-api-access-rhnsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.820562 4992 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.820606 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841214 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-utilities\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841297 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-catalog-content\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841321 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzp28\" (UniqueName: \"kubernetes.io/projected/6a074f4f-b7f6-4892-be89-083d619c0771-kube-api-access-kzp28\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841409 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/b45f463b-e11b-4b00-b459-4c401197e9c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841478 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45f463b-e11b-4b00-b459-4c401197e9c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841521 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-utilities\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841550 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-catalog-content\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841573 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdpxs\" (UniqueName: \"kubernetes.io/projected/de317207-ef37-4247-9d1d-279570141ebc-kube-api-access-fdpxs\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841625 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhnsb\" (UniqueName: 
\"kubernetes.io/projected/65c59658-5ed8-4cef-b36d-2a1e44ec6976-kube-api-access-rhnsb\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.841642 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/65c59658-5ed8-4cef-b36d-2a1e44ec6976-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.844240 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b45f463b-e11b-4b00-b459-4c401197e9c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.846683 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-catalog-content\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.847052 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-utilities\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.859353 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-j6dj7\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:37 crc kubenswrapper[4992]: 
I0131 09:27:37.866274 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdpxs\" (UniqueName: \"kubernetes.io/projected/de317207-ef37-4247-9d1d-279570141ebc-kube-api-access-fdpxs\") pod \"redhat-operators-9xmbj\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.870643 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.879731 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45f463b-e11b-4b00-b459-4c401197e9c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.932173 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.943828 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.944465 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-catalog-content\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.944535 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-utilities\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.944573 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzp28\" (UniqueName: \"kubernetes.io/projected/6a074f4f-b7f6-4892-be89-083d619c0771-kube-api-access-kzp28\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.944627 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:37 crc kubenswrapper[4992]: 
[-]has-synced failed: reason withheld Jan 31 09:27:37 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:37 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.944696 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.945773 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-catalog-content\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.947116 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-utilities\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.949001 4992 generic.go:334] "Generic (PLEG): container finished" podID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerID="39fdb6bc86da38fb7399f8a5b5e925013e96c8b541d073d0ec07a059f33a0a99" exitCode=0 Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.949115 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmbpb" event={"ID":"4aab422a-915a-4fd8-a9f2-3f04bdaee9da","Type":"ContainerDied","Data":"39fdb6bc86da38fb7399f8a5b5e925013e96c8b541d073d0ec07a059f33a0a99"} Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.959260 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" 
event={"ID":"24d2a857-eb20-4eb7-acb2-077e53af8b03","Type":"ContainerStarted","Data":"0485a0342f2e0a3d364ff2adef71f3d425bfe48095229d2f07580ead5416a60b"} Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.967236 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.972965 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qtkp"] Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.980075 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" event={"ID":"65c59658-5ed8-4cef-b36d-2a1e44ec6976","Type":"ContainerDied","Data":"7506ff95ec461360716fe2190e78dae2158766489ff856cbeb956d2e24d55980"} Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.980119 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7506ff95ec461360716fe2190e78dae2158766489ff856cbeb956d2e24d55980" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.980223 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.983210 4992 generic.go:334] "Generic (PLEG): container finished" podID="1688efda-4764-4e8b-b5cf-5544ef6edad8" containerID="df2213682fae6df44c971284e3d852a36afbbeb048583189c8af631d2a6bb8cc" exitCode=0 Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.983285 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1688efda-4764-4e8b-b5cf-5544ef6edad8","Type":"ContainerDied","Data":"df2213682fae6df44c971284e3d852a36afbbeb048583189c8af631d2a6bb8cc"} Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.992500 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:27:37 crc kubenswrapper[4992]: I0131 09:27:37.995931 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzp28\" (UniqueName: \"kubernetes.io/projected/6a074f4f-b7f6-4892-be89-083d619c0771-kube-api-access-kzp28\") pod \"redhat-operators-pt48t\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.046152 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.130974 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b45w"] Jan 31 09:27:38 crc kubenswrapper[4992]: W0131 09:27:38.161613 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cd981c2_7781_4f04_8ef1_73219837a007.slice/crio-047d161dea7959b68e66d8e8d18d8142731e900978747ee23ffbccdc51e30a3b WatchSource:0}: Error finding container 047d161dea7959b68e66d8e8d18d8142731e900978747ee23ffbccdc51e30a3b: Status 404 returned error can't find the container with id 047d161dea7959b68e66d8e8d18d8142731e900978747ee23ffbccdc51e30a3b Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.196247 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.480674 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-j6dj7"] Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.589749 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xmbj"] Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.606880 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:27:38 crc kubenswrapper[4992]: W0131 09:27:38.631006 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde317207_ef37_4247_9d1d_279570141ebc.slice/crio-6c1967a90e05dc91b15c267515def712dc0b9d38117d8641165a7e0c2c01e9eb WatchSource:0}: Error finding container 6c1967a90e05dc91b15c267515def712dc0b9d38117d8641165a7e0c2c01e9eb: Status 404 returned error can't find the container with id 
6c1967a90e05dc91b15c267515def712dc0b9d38117d8641165a7e0c2c01e9eb Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.640346 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pt48t"] Jan 31 09:27:38 crc kubenswrapper[4992]: W0131 09:27:38.695853 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a074f4f_b7f6_4892_be89_083d619c0771.slice/crio-a74e41a32613e9e1a744d07ea44b6bac4fa6921d009f8c0df1074214a86d0e9b WatchSource:0}: Error finding container a74e41a32613e9e1a744d07ea44b6bac4fa6921d009f8c0df1074214a86d0e9b: Status 404 returned error can't find the container with id a74e41a32613e9e1a744d07ea44b6bac4fa6921d009f8c0df1074214a86d0e9b Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.945676 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:38 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:38 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:38 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.945982 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:38 crc kubenswrapper[4992]: I0131 09:27:38.997136 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b45f463b-e11b-4b00-b459-4c401197e9c5","Type":"ContainerStarted","Data":"437327141d34cf287b38454796bf57e4aa1b4a42c65e4baba88c18ff10e9d7ca"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.001930 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" event={"ID":"754c2a0a-7622-4316-9706-e8499dd756a5","Type":"ContainerStarted","Data":"f50160ea56130f69a407cde35580a684c293b1eb690803fa21132cae1e58d7f7"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.001972 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" event={"ID":"754c2a0a-7622-4316-9706-e8499dd756a5","Type":"ContainerStarted","Data":"ad60edf821e1dcc0f27ebe4cb2fcb3c7fd7e4dec9e8b35b696ef5c5fb0ba5de9"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.002474 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.029638 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" event={"ID":"24d2a857-eb20-4eb7-acb2-077e53af8b03","Type":"ContainerStarted","Data":"c2c83c03a996e8912fd0afa03f48e4f8593ba5cceb69dc9d020e38ae955060d5"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.031006 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" podStartSLOduration=132.0309939 podStartE2EDuration="2m12.0309939s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:39.026545969 +0000 UTC m=+154.997937986" watchObservedRunningTime="2026-01-31 09:27:39.0309939 +0000 UTC m=+155.002385887" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.040833 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" 
event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerStarted","Data":"a74e41a32613e9e1a744d07ea44b6bac4fa6921d009f8c0df1074214a86d0e9b"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.049639 4992 generic.go:334] "Generic (PLEG): container finished" podID="1cd981c2-7781-4f04-8ef1-73219837a007" containerID="4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848" exitCode=0 Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.049778 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b45w" event={"ID":"1cd981c2-7781-4f04-8ef1-73219837a007","Type":"ContainerDied","Data":"4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.049898 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b45w" event={"ID":"1cd981c2-7781-4f04-8ef1-73219837a007","Type":"ContainerStarted","Data":"047d161dea7959b68e66d8e8d18d8142731e900978747ee23ffbccdc51e30a3b"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.053759 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xmbj" event={"ID":"de317207-ef37-4247-9d1d-279570141ebc","Type":"ContainerStarted","Data":"6c1967a90e05dc91b15c267515def712dc0b9d38117d8641165a7e0c2c01e9eb"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.060463 4992 generic.go:334] "Generic (PLEG): container finished" podID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerID="3794f68ad026ec339d98f3b94995a7847285f64c04bc04d5b9b1b3e3c2cbae91" exitCode=0 Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.061055 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qtkp" event={"ID":"2cef084e-8345-4b18-ade1-4cc6e9fbfd09","Type":"ContainerDied","Data":"3794f68ad026ec339d98f3b94995a7847285f64c04bc04d5b9b1b3e3c2cbae91"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 
09:27:39.061088 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qtkp" event={"ID":"2cef084e-8345-4b18-ade1-4cc6e9fbfd09","Type":"ContainerStarted","Data":"344c33b97e18935f1cbf913477a3d0add281956c5629bb4bf4ab4ee8b565822f"} Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.066008 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cb5lw" podStartSLOduration=15.065994516 podStartE2EDuration="15.065994516s" podCreationTimestamp="2026-01-31 09:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:27:39.064516062 +0000 UTC m=+155.035908069" watchObservedRunningTime="2026-01-31 09:27:39.065994516 +0000 UTC m=+155.037386503" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.194646 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.406696 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.507630 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1688efda-4764-4e8b-b5cf-5544ef6edad8-kubelet-dir\") pod \"1688efda-4764-4e8b-b5cf-5544ef6edad8\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.507770 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1688efda-4764-4e8b-b5cf-5544ef6edad8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1688efda-4764-4e8b-b5cf-5544ef6edad8" (UID: "1688efda-4764-4e8b-b5cf-5544ef6edad8"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.507834 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1688efda-4764-4e8b-b5cf-5544ef6edad8-kube-api-access\") pod \"1688efda-4764-4e8b-b5cf-5544ef6edad8\" (UID: \"1688efda-4764-4e8b-b5cf-5544ef6edad8\") " Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.508105 4992 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1688efda-4764-4e8b-b5cf-5544ef6edad8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.532496 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1688efda-4764-4e8b-b5cf-5544ef6edad8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1688efda-4764-4e8b-b5cf-5544ef6edad8" (UID: "1688efda-4764-4e8b-b5cf-5544ef6edad8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.609487 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1688efda-4764-4e8b-b5cf-5544ef6edad8-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.941940 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:39 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:39 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:39 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:39 crc kubenswrapper[4992]: I0131 09:27:39.942033 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.094889 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a074f4f-b7f6-4892-be89-083d619c0771" containerID="702ae5728dd8474c0c89693eef5d925a36bae8457fd4f04f05b88ade995e7e9b" exitCode=0 Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.094986 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerDied","Data":"702ae5728dd8474c0c89693eef5d925a36bae8457fd4f04f05b88ade995e7e9b"} Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.098680 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.098691 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1688efda-4764-4e8b-b5cf-5544ef6edad8","Type":"ContainerDied","Data":"cfd918e31b9435feffa2c2ce58426767344b384c8a967c5af4f0897c2a89e216"} Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.098726 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd918e31b9435feffa2c2ce58426767344b384c8a967c5af4f0897c2a89e216" Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.101731 4992 generic.go:334] "Generic (PLEG): container finished" podID="de317207-ef37-4247-9d1d-279570141ebc" containerID="8efb2be9790413a51a2acd66b07b3b93c1f5d281f451f9a0a9a256e6d5cbce3e" exitCode=0 Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.101797 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xmbj" event={"ID":"de317207-ef37-4247-9d1d-279570141ebc","Type":"ContainerDied","Data":"8efb2be9790413a51a2acd66b07b3b93c1f5d281f451f9a0a9a256e6d5cbce3e"} Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.104678 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b45f463b-e11b-4b00-b459-4c401197e9c5","Type":"ContainerStarted","Data":"5c78b2807f2ca092079ef1aba7fefb4c21dab45c8f04bfba2d261df1ddfa8bb6"} Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.941863 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:40 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:40 crc kubenswrapper[4992]: [+]process-running ok Jan 31 
09:27:40 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:40 crc kubenswrapper[4992]: I0131 09:27:40.942115 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:41 crc kubenswrapper[4992]: I0131 09:27:41.124244 4992 generic.go:334] "Generic (PLEG): container finished" podID="b45f463b-e11b-4b00-b459-4c401197e9c5" containerID="5c78b2807f2ca092079ef1aba7fefb4c21dab45c8f04bfba2d261df1ddfa8bb6" exitCode=0 Jan 31 09:27:41 crc kubenswrapper[4992]: I0131 09:27:41.124330 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b45f463b-e11b-4b00-b459-4c401197e9c5","Type":"ContainerDied","Data":"5c78b2807f2ca092079ef1aba7fefb4c21dab45c8f04bfba2d261df1ddfa8bb6"} Jan 31 09:27:41 crc kubenswrapper[4992]: I0131 09:27:41.601408 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:41 crc kubenswrapper[4992]: I0131 09:27:41.606100 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-68zwk" Jan 31 09:27:41 crc kubenswrapper[4992]: I0131 09:27:41.950925 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:41 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:41 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:41 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:41 crc kubenswrapper[4992]: I0131 09:27:41.951002 4992 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:42 crc kubenswrapper[4992]: I0131 09:27:42.363918 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2bwrf" Jan 31 09:27:42 crc kubenswrapper[4992]: I0131 09:27:42.943366 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:42 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:42 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:42 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:42 crc kubenswrapper[4992]: I0131 09:27:42.943768 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:43 crc kubenswrapper[4992]: I0131 09:27:43.163665 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-kmv44_eb9a2d0a-5c18-44d4-aa62-922d1937a7a4/cluster-samples-operator/0.log" Jan 31 09:27:43 crc kubenswrapper[4992]: I0131 09:27:43.163788 4992 generic.go:334] "Generic (PLEG): container finished" podID="eb9a2d0a-5c18-44d4-aa62-922d1937a7a4" containerID="9adf6776d9a34392d8efba06cb1d2af15a672210d868a1c75249ec0b6460ba0b" exitCode=2 Jan 31 09:27:43 crc kubenswrapper[4992]: I0131 09:27:43.163871 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" 
event={"ID":"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4","Type":"ContainerDied","Data":"9adf6776d9a34392d8efba06cb1d2af15a672210d868a1c75249ec0b6460ba0b"} Jan 31 09:27:43 crc kubenswrapper[4992]: I0131 09:27:43.165095 4992 scope.go:117] "RemoveContainer" containerID="9adf6776d9a34392d8efba06cb1d2af15a672210d868a1c75249ec0b6460ba0b" Jan 31 09:27:43 crc kubenswrapper[4992]: I0131 09:27:43.943212 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:43 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:43 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:43 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:43 crc kubenswrapper[4992]: I0131 09:27:43.943268 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:44 crc kubenswrapper[4992]: I0131 09:27:44.943559 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:44 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:44 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:44 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:44 crc kubenswrapper[4992]: I0131 09:27:44.943630 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 
09:27:45 crc kubenswrapper[4992]: I0131 09:27:45.302003 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:27:45 crc kubenswrapper[4992]: I0131 09:27:45.302474 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:27:45 crc kubenswrapper[4992]: I0131 09:27:45.941308 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:45 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:45 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:45 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:45 crc kubenswrapper[4992]: I0131 09:27:45.941391 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:46 crc kubenswrapper[4992]: I0131 09:27:46.865027 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:27:46 crc kubenswrapper[4992]: I0131 09:27:46.865169 4992 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:27:46 crc kubenswrapper[4992]: I0131 09:27:46.865537 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:27:46 crc kubenswrapper[4992]: I0131 09:27:46.865647 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:27:46 crc kubenswrapper[4992]: I0131 09:27:46.942165 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:27:46 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld Jan 31 09:27:46 crc kubenswrapper[4992]: [+]process-running ok Jan 31 09:27:46 crc kubenswrapper[4992]: healthz check failed Jan 31 09:27:46 crc kubenswrapper[4992]: I0131 09:27:46.942221 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:27:47 crc kubenswrapper[4992]: I0131 09:27:47.135493 4992 patch_prober.go:28] interesting pod/console-f9d7485db-7bjlw container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Jan 31 09:27:47 crc kubenswrapper[4992]: I0131 09:27:47.135602 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7bjlw" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Jan 31 09:27:47 crc kubenswrapper[4992]: I0131 09:27:47.941878 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:47 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:47 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:47 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:47 crc kubenswrapper[4992]: I0131 09:27:47.941941 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:48 crc kubenswrapper[4992]: I0131 09:27:48.941910 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:48 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:48 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:48 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:48 crc kubenswrapper[4992]: I0131 09:27:48.941972 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:49 crc kubenswrapper[4992]: I0131 09:27:49.203439 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6"
Jan 31 09:27:49 crc kubenswrapper[4992]: I0131 09:27:49.213071 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afb1d129-e6bb-4db2-8204-3a1f4d91048e-metrics-certs\") pod \"network-metrics-daemon-bplq6\" (UID: \"afb1d129-e6bb-4db2-8204-3a1f4d91048e\") " pod="openshift-multus/network-metrics-daemon-bplq6"
Jan 31 09:27:49 crc kubenswrapper[4992]: I0131 09:27:49.413875 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bplq6"
Jan 31 09:27:49 crc kubenswrapper[4992]: I0131 09:27:49.883144 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 31 09:27:49 crc kubenswrapper[4992]: I0131 09:27:49.948907 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:49 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:49 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:49 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:49 crc kubenswrapper[4992]: I0131 09:27:49.949018 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.046970 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45f463b-e11b-4b00-b459-4c401197e9c5-kube-api-access\") pod \"b45f463b-e11b-4b00-b459-4c401197e9c5\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") "
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.048839 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b45f463b-e11b-4b00-b459-4c401197e9c5-kubelet-dir\") pod \"b45f463b-e11b-4b00-b459-4c401197e9c5\" (UID: \"b45f463b-e11b-4b00-b459-4c401197e9c5\") "
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.049764 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45f463b-e11b-4b00-b459-4c401197e9c5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b45f463b-e11b-4b00-b459-4c401197e9c5" (UID: "b45f463b-e11b-4b00-b459-4c401197e9c5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.053654 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45f463b-e11b-4b00-b459-4c401197e9c5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b45f463b-e11b-4b00-b459-4c401197e9c5" (UID: "b45f463b-e11b-4b00-b459-4c401197e9c5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.151446 4992 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b45f463b-e11b-4b00-b459-4c401197e9c5-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.151494 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45f463b-e11b-4b00-b459-4c401197e9c5-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.214113 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b45f463b-e11b-4b00-b459-4c401197e9c5","Type":"ContainerDied","Data":"437327141d34cf287b38454796bf57e4aa1b4a42c65e4baba88c18ff10e9d7ca"}
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.214149 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="437327141d34cf287b38454796bf57e4aa1b4a42c65e4baba88c18ff10e9d7ca"
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.214186 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.941255 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:50 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:50 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:50 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:50 crc kubenswrapper[4992]: I0131 09:27:50.941706 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:51 crc kubenswrapper[4992]: I0131 09:27:51.944134 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:51 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:51 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:51 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:51 crc kubenswrapper[4992]: I0131 09:27:51.944663 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:52 crc kubenswrapper[4992]: I0131 09:27:52.188848 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bplq6"]
Jan 31 09:27:52 crc kubenswrapper[4992]: W0131 09:27:52.201733 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb1d129_e6bb_4db2_8204_3a1f4d91048e.slice/crio-f61b6de3c932047b5e49443998d01a4aa48cb5cd40c54ed02307e43f39693b2b WatchSource:0}: Error finding container f61b6de3c932047b5e49443998d01a4aa48cb5cd40c54ed02307e43f39693b2b: Status 404 returned error can't find the container with id f61b6de3c932047b5e49443998d01a4aa48cb5cd40c54ed02307e43f39693b2b
Jan 31 09:27:52 crc kubenswrapper[4992]: I0131 09:27:52.226732 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bplq6" event={"ID":"afb1d129-e6bb-4db2-8204-3a1f4d91048e","Type":"ContainerStarted","Data":"f61b6de3c932047b5e49443998d01a4aa48cb5cd40c54ed02307e43f39693b2b"}
Jan 31 09:27:52 crc kubenswrapper[4992]: I0131 09:27:52.948123 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:52 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:52 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:52 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:52 crc kubenswrapper[4992]: I0131 09:27:52.948212 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:53 crc kubenswrapper[4992]: I0131 09:27:53.237981 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-kmv44_eb9a2d0a-5c18-44d4-aa62-922d1937a7a4/cluster-samples-operator/0.log"
Jan 31 09:27:53 crc kubenswrapper[4992]: I0131 09:27:53.238063 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kmv44" event={"ID":"eb9a2d0a-5c18-44d4-aa62-922d1937a7a4","Type":"ContainerStarted","Data":"484bc57f32f6a7cf45e80bc1f7d53a74b0f0b2a10a4be417b91ebf98244f74ef"}
Jan 31 09:27:53 crc kubenswrapper[4992]: I0131 09:27:53.942401 4992 patch_prober.go:28] interesting pod/router-default-5444994796-8vlmm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:27:53 crc kubenswrapper[4992]: [-]has-synced failed: reason withheld
Jan 31 09:27:53 crc kubenswrapper[4992]: [+]process-running ok
Jan 31 09:27:53 crc kubenswrapper[4992]: healthz check failed
Jan 31 09:27:53 crc kubenswrapper[4992]: I0131 09:27:53.942506 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8vlmm" podUID="b07fa7cc-f0a6-4ef1-b858-ce216d0eaef6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:27:54 crc kubenswrapper[4992]: I0131 09:27:54.294135 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-srdfh"]
Jan 31 09:27:54 crc kubenswrapper[4992]: I0131 09:27:54.294683 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" containerID="cri-o://5ff37b9426106bb1df8796a01750e6bac347fb5a39840af90789468ee82dd76a" gracePeriod=30
Jan 31 09:27:54 crc kubenswrapper[4992]: I0131 09:27:54.323470 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv"]
Jan 31 09:27:54 crc kubenswrapper[4992]: I0131 09:27:54.323741 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" containerID="cri-o://60007e6d3b815848c7fb9e8e80a70c610d4e6f3dc77a4acbe2c99127fbfe39aa" gracePeriod=30
Jan 31 09:27:54 crc kubenswrapper[4992]: I0131 09:27:54.942977 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8vlmm"
Jan 31 09:27:54 crc kubenswrapper[4992]: I0131 09:27:54.944874 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8vlmm"
Jan 31 09:27:55 crc kubenswrapper[4992]: I0131 09:27:55.250532 4992 generic.go:334] "Generic (PLEG): container finished" podID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerID="5ff37b9426106bb1df8796a01750e6bac347fb5a39840af90789468ee82dd76a" exitCode=0
Jan 31 09:27:55 crc kubenswrapper[4992]: I0131 09:27:55.250653 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" event={"ID":"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220","Type":"ContainerDied","Data":"5ff37b9426106bb1df8796a01750e6bac347fb5a39840af90789468ee82dd76a"}
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.256895 4992 generic.go:334] "Generic (PLEG): container finished" podID="3c397c46-5579-414a-aca9-3822b9e603ea" containerID="60007e6d3b815848c7fb9e8e80a70c610d4e6f3dc77a4acbe2c99127fbfe39aa" exitCode=0
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.256944 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" event={"ID":"3c397c46-5579-414a-aca9-3822b9e603ea","Type":"ContainerDied","Data":"60007e6d3b815848c7fb9e8e80a70c610d4e6f3dc77a4acbe2c99127fbfe39aa"}
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.412769 4992 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bcggv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.412819 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.761962 4992 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-srdfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.762032 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.861783 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.861842 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.861865 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.861942 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.862019 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-jgrjj"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.862560 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.862588 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.862909 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"25eeff866992b77a0acc0b99d7fc52c1e7f84a70bfddcdc55186a0e4c6ceab85"} pod="openshift-console/downloads-7954f5f757-jgrjj" containerMessage="Container download-server failed liveness probe, will be restarted"
Jan 31 09:27:56 crc kubenswrapper[4992]: I0131 09:27:56.863040 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" containerID="cri-o://25eeff866992b77a0acc0b99d7fc52c1e7f84a70bfddcdc55186a0e4c6ceab85" gracePeriod=2
Jan 31 09:27:57 crc kubenswrapper[4992]: I0131 09:27:57.134488 4992 patch_prober.go:28] interesting pod/console-f9d7485db-7bjlw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Jan 31 09:27:57 crc kubenswrapper[4992]: I0131 09:27:57.134747 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7bjlw" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Jan 31 09:27:57 crc kubenswrapper[4992]: I0131 09:27:57.264487 4992 generic.go:334] "Generic (PLEG): container finished" podID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerID="25eeff866992b77a0acc0b99d7fc52c1e7f84a70bfddcdc55186a0e4c6ceab85" exitCode=0
Jan 31 09:27:57 crc kubenswrapper[4992]: I0131 09:27:57.264577 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jgrjj" event={"ID":"dd243542-ca16-4b95-9fa1-b579ee3cca2e","Type":"ContainerDied","Data":"25eeff866992b77a0acc0b99d7fc52c1e7f84a70bfddcdc55186a0e4c6ceab85"}
Jan 31 09:27:58 crc kubenswrapper[4992]: I0131 09:27:58.051583 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7"
Jan 31 09:28:06 crc kubenswrapper[4992]: I0131 09:28:06.412298 4992 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bcggv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Jan 31 09:28:06 crc kubenswrapper[4992]: I0131 09:28:06.412532 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Jan 31 09:28:06 crc kubenswrapper[4992]: I0131 09:28:06.761763 4992 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-srdfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Jan 31 09:28:06 crc kubenswrapper[4992]: I0131 09:28:06.762118 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Jan 31 09:28:06 crc kubenswrapper[4992]: I0131 09:28:06.864249 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 31 09:28:06 crc kubenswrapper[4992]: I0131 09:28:06.864328 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 31 09:28:07 crc kubenswrapper[4992]: I0131 09:28:07.138376 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7bjlw"
Jan 31 09:28:07 crc kubenswrapper[4992]: I0131 09:28:07.141978 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7bjlw"
Jan 31 09:28:07 crc kubenswrapper[4992]: I0131 09:28:07.383970 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b99x8"
Jan 31 09:28:11 crc kubenswrapper[4992]: E0131 09:28:11.276126 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Jan 31 09:28:11 crc kubenswrapper[4992]: E0131 09:28:11.277711 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdpxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9xmbj_openshift-marketplace(de317207-ef37-4247-9d1d-279570141ebc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 31 09:28:11 crc kubenswrapper[4992]: E0131 09:28:11.279022 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9xmbj" podUID="de317207-ef37-4247-9d1d-279570141ebc"
Jan 31 09:28:13 crc kubenswrapper[4992]: I0131 09:28:13.653578 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:28:15 crc kubenswrapper[4992]: I0131 09:28:15.301834 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 09:28:15 crc kubenswrapper[4992]: I0131 09:28:15.302670 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 09:28:16 crc kubenswrapper[4992]: E0131 09:28:16.190262 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9xmbj" podUID="de317207-ef37-4247-9d1d-279570141ebc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.411496 4992 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bcggv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.411798 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.562290 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 31 09:28:16 crc kubenswrapper[4992]: E0131 09:28:16.562549 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1688efda-4764-4e8b-b5cf-5544ef6edad8" containerName="pruner"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.562563 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1688efda-4764-4e8b-b5cf-5544ef6edad8" containerName="pruner"
Jan 31 09:28:16 crc kubenswrapper[4992]: E0131 09:28:16.562587 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45f463b-e11b-4b00-b459-4c401197e9c5" containerName="pruner"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.562596 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45f463b-e11b-4b00-b459-4c401197e9c5" containerName="pruner"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.562709 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1688efda-4764-4e8b-b5cf-5544ef6edad8" containerName="pruner"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.562722 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45f463b-e11b-4b00-b459-4c401197e9c5" containerName="pruner"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.563142 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.565484 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.565862 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.573076 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.713221 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.713376 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.762189 4992 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-srdfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.762266 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.814469 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.814583 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.814646 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.832980 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.862182 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.862290 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 31 09:28:16 crc kubenswrapper[4992]: I0131 09:28:16.897327 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 31 09:28:18 crc kubenswrapper[4992]: E0131 09:28:18.353792 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Jan 31 09:28:18 crc kubenswrapper[4992]: E0131 09:28:18.354597 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctmrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-j9mrh_openshift-marketplace(0377ed6d-ea6e-44cb-9d09-0c817af64b22): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 31 09:28:18 crc kubenswrapper[4992]: E0131 09:28:18.355839 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-j9mrh" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22"
Jan 31 09:28:18 crc kubenswrapper[4992]: I0131 09:28:18.376511 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bplq6" event={"ID":"afb1d129-e6bb-4db2-8204-3a1f4d91048e","Type":"ContainerStarted","Data":"7a4cbc9e40594974e6421ae5905494931fcad01f1ef84e215c0376a839348726"}
Jan 31 09:28:20 crc kubenswrapper[4992]: E0131 09:28:20.396926 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 31 09:28:20 crc kubenswrapper[4992]: E0131 09:28:20.397160 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zr2jx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptio
ns:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2b45w_openshift-marketplace(1cd981c2-7781-4f04-8ef1-73219837a007): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:28:20 crc kubenswrapper[4992]: E0131 09:28:20.399160 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2b45w" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.763496 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.768172 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.786225 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.873517 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.873834 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-var-lock\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.873909 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kube-api-access\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: E0131 09:28:20.887750 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 31 09:28:20 crc kubenswrapper[4992]: E0131 09:28:20.887883 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9nwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4drj6_openshift-marketplace(7c4d6b90-976c-46f4-b55f-26d3277cc754): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:28:20 crc kubenswrapper[4992]: E0131 09:28:20.889068 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4drj6" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.975311 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kube-api-access\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.975377 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.975394 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-var-lock\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.975497 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-var-lock\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.975794 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:20 crc kubenswrapper[4992]: I0131 09:28:20.995612 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kube-api-access\") pod \"installer-9-crc\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:21 crc kubenswrapper[4992]: I0131 09:28:21.091143 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:28:26 crc kubenswrapper[4992]: E0131 09:28:26.177680 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-j9mrh" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" Jan 31 09:28:26 crc kubenswrapper[4992]: E0131 09:28:26.177983 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2b45w" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" Jan 31 09:28:26 crc kubenswrapper[4992]: E0131 09:28:26.177679 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4drj6" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" Jan 31 09:28:26 crc kubenswrapper[4992]: I0131 09:28:26.414672 4992 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bcggv container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Jan 31 09:28:26 crc kubenswrapper[4992]: I0131 09:28:26.414731 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Jan 31 09:28:26 crc kubenswrapper[4992]: I0131 09:28:26.861897 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:28:26 crc kubenswrapper[4992]: I0131 09:28:26.861950 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:28:27 crc kubenswrapper[4992]: I0131 09:28:27.760991 4992 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-srdfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:28:27 crc kubenswrapper[4992]: I0131 09:28:27.761073 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.031135 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.031924 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kzp28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessageP
olicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-pt48t_openshift-marketplace(6a074f4f-b7f6-4892-be89-083d619c0771): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.033195 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pt48t" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.106953 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.110127 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.110471 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5vmvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rmbpb_openshift-marketplace(4aab422a-915a-4fd8-a9f2-3f04bdaee9da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.112823 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rmbpb" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" Jan 31 09:28:30 crc 
kubenswrapper[4992]: I0131 09:28:30.147852 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f998f6c97-kmgqq"] Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.148142 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.148157 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.148277 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" containerName="controller-manager" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.148732 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.154922 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f998f6c97-kmgqq"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.198140 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhhg\" (UniqueName: \"kubernetes.io/projected/a505a892-3153-4dfe-abdb-c83998d795c1-kube-api-access-2bhhg\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.198379 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-client-ca\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " 
pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.198512 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a505a892-3153-4dfe-abdb-c83998d795c1-serving-cert\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.198598 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-config\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.198631 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-proxy-ca-bundles\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.234191 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.234389 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rb5wd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b7j8l_openshift-marketplace(533c10ab-faa7-4a62-8e8a-2ebd87578ced): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.235945 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b7j8l" 
podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.296541 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.299340 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-config\") pod \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.299441 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgv4k\" (UniqueName: \"kubernetes.io/projected/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-kube-api-access-qgv4k\") pod \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.299483 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-serving-cert\") pod \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.299513 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-proxy-ca-bundles\") pod \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.300441 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-config" (OuterVolumeSpecName: "config") pod "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" (UID: 
"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.300866 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" (UID: "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.300967 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-client-ca\") pod \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\" (UID: \"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.300996 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c397c46-5579-414a-aca9-3822b9e603ea-serving-cert\") pod \"3c397c46-5579-414a-aca9-3822b9e603ea\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301152 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a505a892-3153-4dfe-abdb-c83998d795c1-serving-cert\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301196 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-config\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: 
\"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301218 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-proxy-ca-bundles\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301262 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhhg\" (UniqueName: \"kubernetes.io/projected/a505a892-3153-4dfe-abdb-c83998d795c1-kube-api-access-2bhhg\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301284 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-client-ca\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301297 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-client-ca" (OuterVolumeSpecName: "client-ca") pod "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" (UID: "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301328 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.301338 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.302621 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-client-ca\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.304701 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-kube-api-access-qgv4k" (OuterVolumeSpecName: "kube-api-access-qgv4k") pod "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" (UID: "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220"). InnerVolumeSpecName "kube-api-access-qgv4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.305532 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-proxy-ca-bundles\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.306156 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" (UID: "f1a9d9bf-4e63-41da-87c9-05bbfe1b3220"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.306721 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c397c46-5579-414a-aca9-3822b9e603ea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c397c46-5579-414a-aca9-3822b9e603ea" (UID: "3c397c46-5579-414a-aca9-3822b9e603ea"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.310108 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a505a892-3153-4dfe-abdb-c83998d795c1-serving-cert\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.316094 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-config\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.322020 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhhg\" (UniqueName: \"kubernetes.io/projected/a505a892-3153-4dfe-abdb-c83998d795c1-kube-api-access-2bhhg\") pod \"controller-manager-5f998f6c97-kmgqq\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401684 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-config\") pod \"3c397c46-5579-414a-aca9-3822b9e603ea\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401761 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2g5j\" (UniqueName: \"kubernetes.io/projected/3c397c46-5579-414a-aca9-3822b9e603ea-kube-api-access-m2g5j\") pod \"3c397c46-5579-414a-aca9-3822b9e603ea\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " Jan 31 
09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401794 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-client-ca\") pod \"3c397c46-5579-414a-aca9-3822b9e603ea\" (UID: \"3c397c46-5579-414a-aca9-3822b9e603ea\") " Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401964 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401975 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c397c46-5579-414a-aca9-3822b9e603ea-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401984 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgv4k\" (UniqueName: \"kubernetes.io/projected/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-kube-api-access-qgv4k\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.401993 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.402495 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c397c46-5579-414a-aca9-3822b9e603ea" (UID: "3c397c46-5579-414a-aca9-3822b9e603ea"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.402515 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-config" (OuterVolumeSpecName: "config") pod "3c397c46-5579-414a-aca9-3822b9e603ea" (UID: "3c397c46-5579-414a-aca9-3822b9e603ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.408821 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c397c46-5579-414a-aca9-3822b9e603ea-kube-api-access-m2g5j" (OuterVolumeSpecName: "kube-api-access-m2g5j") pod "3c397c46-5579-414a-aca9-3822b9e603ea" (UID: "3c397c46-5579-414a-aca9-3822b9e603ea"). InnerVolumeSpecName "kube-api-access-m2g5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.457026 4992 generic.go:334] "Generic (PLEG): container finished" podID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerID="4ce74399488f1e5ead804bf88ad0d77ab22df4d92da4f1e6c3adcdd63a632172" exitCode=0 Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.457223 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qtkp" event={"ID":"2cef084e-8345-4b18-ade1-4cc6e9fbfd09","Type":"ContainerDied","Data":"4ce74399488f1e5ead804bf88ad0d77ab22df4d92da4f1e6c3adcdd63a632172"} Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.461636 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" event={"ID":"f1a9d9bf-4e63-41da-87c9-05bbfe1b3220","Type":"ContainerDied","Data":"932699a81f28fce7116e7a63e79aec77f5e92242b008bf75611e4401943a78dd"} Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.461687 4992 scope.go:117] "RemoveContainer" 
containerID="5ff37b9426106bb1df8796a01750e6bac347fb5a39840af90789468ee82dd76a" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.461797 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-srdfh" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.465939 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bplq6" event={"ID":"afb1d129-e6bb-4db2-8204-3a1f4d91048e","Type":"ContainerStarted","Data":"3bee2a21ee1ed607d03d0ca2dbe084f59677177c15829e56a6ff8b5035858ecb"} Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.473281 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jgrjj" event={"ID":"dd243542-ca16-4b95-9fa1-b579ee3cca2e","Type":"ContainerStarted","Data":"f28d9941d65de63b48beb16ab806e933fa633cad36b59564e65f1963596d7e48"} Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.473882 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.474165 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.474201 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.476702 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" event={"ID":"3c397c46-5579-414a-aca9-3822b9e603ea","Type":"ContainerDied","Data":"6eabb2fef30805cd1ec94df1aba7ad87d4a2d5adbfc5f7b41bc26017116454e9"} Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.478137 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.480019 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rmbpb" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.480314 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b7j8l" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" Jan 31 09:28:30 crc kubenswrapper[4992]: E0131 09:28:30.480582 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pt48t" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.481395 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.487114 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.493465 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.502996 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2g5j\" (UniqueName: \"kubernetes.io/projected/3c397c46-5579-414a-aca9-3822b9e603ea-kube-api-access-m2g5j\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.503021 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.503035 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c397c46-5579-414a-aca9-3822b9e603ea-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.511099 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bplq6" podStartSLOduration=183.511079122 podStartE2EDuration="3m3.511079122s" podCreationTimestamp="2026-01-31 09:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:28:30.506571859 +0000 UTC m=+206.477963856" watchObservedRunningTime="2026-01-31 09:28:30.511079122 +0000 UTC m=+206.482471109" Jan 31 09:28:30 crc kubenswrapper[4992]: W0131 09:28:30.516016 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod865f17ae_618a_4f0a_b79b_3da46e3ea9c9.slice/crio-589a9ab4b7801ff2f4b9c8e03552eea7fce3a677baeee92b15d4930afc3fb59c WatchSource:0}: Error finding container 589a9ab4b7801ff2f4b9c8e03552eea7fce3a677baeee92b15d4930afc3fb59c: Status 404 returned error can't find the container with id 589a9ab4b7801ff2f4b9c8e03552eea7fce3a677baeee92b15d4930afc3fb59c Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.520755 4992 scope.go:117] "RemoveContainer" containerID="60007e6d3b815848c7fb9e8e80a70c610d4e6f3dc77a4acbe2c99127fbfe39aa" Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.603085 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.608924 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bcggv"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.628945 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-srdfh"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.634361 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-srdfh"] Jan 31 09:28:30 crc kubenswrapper[4992]: I0131 09:28:30.947277 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f998f6c97-kmgqq"] Jan 31 09:28:30 crc kubenswrapper[4992]: W0131 09:28:30.953643 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda505a892_3153_4dfe_abdb_c83998d795c1.slice/crio-27e84e605c81817d5fbc50663b4b3d6ff3e41da737195224c29532ec60cb9e3b WatchSource:0}: Error finding container 27e84e605c81817d5fbc50663b4b3d6ff3e41da737195224c29532ec60cb9e3b: Status 404 returned error can't find the container with 
id 27e84e605c81817d5fbc50663b4b3d6ff3e41da737195224c29532ec60cb9e3b Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.197925 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" path="/var/lib/kubelet/pods/3c397c46-5579-414a-aca9-3822b9e603ea/volumes" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.198811 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a9d9bf-4e63-41da-87c9-05bbfe1b3220" path="/var/lib/kubelet/pods/f1a9d9bf-4e63-41da-87c9-05bbfe1b3220/volumes" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.487229 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0","Type":"ContainerStarted","Data":"a3d65764cc5a6d7733838c0524a39001b6f17f244be09f93dbef1134c64a8e62"} Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.488052 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0","Type":"ContainerStarted","Data":"d7f9b391f32543f4fc00569eaf464944b74b25862ae5fff6249d24aa6954b056"} Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.493887 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" event={"ID":"a505a892-3153-4dfe-abdb-c83998d795c1","Type":"ContainerStarted","Data":"09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb"} Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.493942 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" event={"ID":"a505a892-3153-4dfe-abdb-c83998d795c1","Type":"ContainerStarted","Data":"27e84e605c81817d5fbc50663b4b3d6ff3e41da737195224c29532ec60cb9e3b"} Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.494219 4992 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.496638 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"865f17ae-618a-4f0a-b79b-3da46e3ea9c9","Type":"ContainerStarted","Data":"1ac471460f9a5678e46bcac00b8bafd8f3ef352d00bade435edf3eaff6891491"} Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.496676 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"865f17ae-618a-4f0a-b79b-3da46e3ea9c9","Type":"ContainerStarted","Data":"589a9ab4b7801ff2f4b9c8e03552eea7fce3a677baeee92b15d4930afc3fb59c"} Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.503477 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.505854 4992 patch_prober.go:28] interesting pod/downloads-7954f5f757-jgrjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.505909 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jgrjj" podUID="dd243542-ca16-4b95-9fa1-b579ee3cca2e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.514882 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=11.514860033 podStartE2EDuration="11.514860033s" podCreationTimestamp="2026-01-31 09:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:28:31.512444431 +0000 UTC m=+207.483836418" watchObservedRunningTime="2026-01-31 09:28:31.514860033 +0000 UTC m=+207.486252010" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.534814 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" podStartSLOduration=17.534786823 podStartE2EDuration="17.534786823s" podCreationTimestamp="2026-01-31 09:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:28:31.533184835 +0000 UTC m=+207.504576842" watchObservedRunningTime="2026-01-31 09:28:31.534786823 +0000 UTC m=+207.506178810" Jan 31 09:28:31 crc kubenswrapper[4992]: I0131 09:28:31.563716 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=15.563687349 podStartE2EDuration="15.563687349s" podCreationTimestamp="2026-01-31 09:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:28:31.558440643 +0000 UTC m=+207.529832620" watchObservedRunningTime="2026-01-31 09:28:31.563687349 +0000 UTC m=+207.535079336" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.511702 4992 generic.go:334] "Generic (PLEG): container finished" podID="865f17ae-618a-4f0a-b79b-3da46e3ea9c9" containerID="1ac471460f9a5678e46bcac00b8bafd8f3ef352d00bade435edf3eaff6891491" exitCode=0 Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.511928 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"865f17ae-618a-4f0a-b79b-3da46e3ea9c9","Type":"ContainerDied","Data":"1ac471460f9a5678e46bcac00b8bafd8f3ef352d00bade435edf3eaff6891491"} Jan 31 09:28:32 crc 
kubenswrapper[4992]: I0131 09:28:32.513889 4992 generic.go:334] "Generic (PLEG): container finished" podID="de317207-ef37-4247-9d1d-279570141ebc" containerID="24d6f56d5d7e13f415e7cf4b2f7dd19d69d56877714e1c5f80840cdef75dd50d" exitCode=0 Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.513958 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xmbj" event={"ID":"de317207-ef37-4247-9d1d-279570141ebc","Type":"ContainerDied","Data":"24d6f56d5d7e13f415e7cf4b2f7dd19d69d56877714e1c5f80840cdef75dd50d"} Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.516688 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qtkp" event={"ID":"2cef084e-8345-4b18-ade1-4cc6e9fbfd09","Type":"ContainerStarted","Data":"22291ad67814e70eb10003c0326e63a44ad4978e481eb5ebba9dc6877389b215"} Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.572567 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v"] Jan 31 09:28:32 crc kubenswrapper[4992]: E0131 09:28:32.572871 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.572890 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.573045 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c397c46-5579-414a-aca9-3822b9e603ea" containerName="route-controller-manager" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.573525 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.575581 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.575661 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.575830 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.575862 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.575871 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.576249 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.581954 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2qtkp" podStartSLOduration=4.215655287 podStartE2EDuration="56.581935088s" podCreationTimestamp="2026-01-31 09:27:36 +0000 UTC" firstStartedPulling="2026-01-31 09:27:39.065309606 +0000 UTC m=+155.036701593" lastFinishedPulling="2026-01-31 09:28:31.431589397 +0000 UTC m=+207.402981394" observedRunningTime="2026-01-31 09:28:32.581613868 +0000 UTC m=+208.553005875" watchObservedRunningTime="2026-01-31 09:28:32.581935088 +0000 UTC m=+208.553327075" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.591181 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v"] Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.736012 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4jj8\" (UniqueName: \"kubernetes.io/projected/b194d796-8bc3-4599-905b-16c28f19f7f4-kube-api-access-k4jj8\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.736097 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-client-ca\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.736190 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b194d796-8bc3-4599-905b-16c28f19f7f4-serving-cert\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.736227 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-config\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.837187 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k4jj8\" (UniqueName: \"kubernetes.io/projected/b194d796-8bc3-4599-905b-16c28f19f7f4-kube-api-access-k4jj8\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.837721 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-client-ca\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.837781 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b194d796-8bc3-4599-905b-16c28f19f7f4-serving-cert\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.837807 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-config\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.839163 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-config\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " 
pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.839837 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-client-ca\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.847940 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b194d796-8bc3-4599-905b-16c28f19f7f4-serving-cert\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.864384 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4jj8\" (UniqueName: \"kubernetes.io/projected/b194d796-8bc3-4599-905b-16c28f19f7f4-kube-api-access-k4jj8\") pod \"route-controller-manager-96785bd75-4kl7v\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:32 crc kubenswrapper[4992]: I0131 09:28:32.891095 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.147326 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v"] Jan 31 09:28:33 crc kubenswrapper[4992]: W0131 09:28:33.162228 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb194d796_8bc3_4599_905b_16c28f19f7f4.slice/crio-90b61467e9ee7368035af90feb7174e4d7f59ee1394da50c84e1a601e5ae8413 WatchSource:0}: Error finding container 90b61467e9ee7368035af90feb7174e4d7f59ee1394da50c84e1a601e5ae8413: Status 404 returned error can't find the container with id 90b61467e9ee7368035af90feb7174e4d7f59ee1394da50c84e1a601e5ae8413 Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.523486 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xmbj" event={"ID":"de317207-ef37-4247-9d1d-279570141ebc","Type":"ContainerStarted","Data":"21d7c088f7dec369814d7bf5a663c8b9dc81e84a1c87207d791830d4bbf75e3b"} Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.525167 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" event={"ID":"b194d796-8bc3-4599-905b-16c28f19f7f4","Type":"ContainerStarted","Data":"a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779"} Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.525236 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" event={"ID":"b194d796-8bc3-4599-905b-16c28f19f7f4","Type":"ContainerStarted","Data":"90b61467e9ee7368035af90feb7174e4d7f59ee1394da50c84e1a601e5ae8413"} Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.552383 4992 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-operators-9xmbj" podStartSLOduration=3.709936701 podStartE2EDuration="56.55236675s" podCreationTimestamp="2026-01-31 09:27:37 +0000 UTC" firstStartedPulling="2026-01-31 09:27:40.103931418 +0000 UTC m=+156.075323405" lastFinishedPulling="2026-01-31 09:28:32.946361467 +0000 UTC m=+208.917753454" observedRunningTime="2026-01-31 09:28:33.548440584 +0000 UTC m=+209.519832591" watchObservedRunningTime="2026-01-31 09:28:33.55236675 +0000 UTC m=+209.523758737" Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.572302 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" podStartSLOduration=19.57228155 podStartE2EDuration="19.57228155s" podCreationTimestamp="2026-01-31 09:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:28:33.568797137 +0000 UTC m=+209.540189144" watchObservedRunningTime="2026-01-31 09:28:33.57228155 +0000 UTC m=+209.543673537" Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.827727 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.958256 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kubelet-dir\") pod \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.958399 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "865f17ae-618a-4f0a-b79b-3da46e3ea9c9" (UID: "865f17ae-618a-4f0a-b79b-3da46e3ea9c9"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.958466 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kube-api-access\") pod \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\" (UID: \"865f17ae-618a-4f0a-b79b-3da46e3ea9c9\") " Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.958748 4992 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:33 crc kubenswrapper[4992]: I0131 09:28:33.964602 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "865f17ae-618a-4f0a-b79b-3da46e3ea9c9" (UID: "865f17ae-618a-4f0a-b79b-3da46e3ea9c9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:28:34 crc kubenswrapper[4992]: I0131 09:28:34.059874 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/865f17ae-618a-4f0a-b79b-3da46e3ea9c9-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:28:34 crc kubenswrapper[4992]: I0131 09:28:34.534692 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"865f17ae-618a-4f0a-b79b-3da46e3ea9c9","Type":"ContainerDied","Data":"589a9ab4b7801ff2f4b9c8e03552eea7fce3a677baeee92b15d4930afc3fb59c"} Jan 31 09:28:34 crc kubenswrapper[4992]: I0131 09:28:34.534767 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:28:34 crc kubenswrapper[4992]: I0131 09:28:34.534787 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589a9ab4b7801ff2f4b9c8e03552eea7fce3a677baeee92b15d4930afc3fb59c" Jan 31 09:28:34 crc kubenswrapper[4992]: I0131 09:28:34.534838 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:34 crc kubenswrapper[4992]: I0131 09:28:34.546621 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:28:36 crc kubenswrapper[4992]: I0131 09:28:36.877994 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jgrjj" Jan 31 09:28:37 crc kubenswrapper[4992]: I0131 09:28:37.074004 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:28:37 crc kubenswrapper[4992]: I0131 09:28:37.074354 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:28:37 crc kubenswrapper[4992]: I0131 09:28:37.918920 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:28:37 crc kubenswrapper[4992]: I0131 09:28:37.933356 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:28:37 crc kubenswrapper[4992]: I0131 09:28:37.933409 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:28:38 crc kubenswrapper[4992]: I0131 09:28:38.617263 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:28:38 crc kubenswrapper[4992]: I0131 09:28:38.987757 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9xmbj" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="registry-server" probeResult="failure" output=< Jan 31 09:28:38 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 09:28:38 crc kubenswrapper[4992]: > Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.598952 4992 generic.go:334] "Generic (PLEG): container finished" podID="1cd981c2-7781-4f04-8ef1-73219837a007" containerID="9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c" exitCode=0 Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.599030 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b45w" event={"ID":"1cd981c2-7781-4f04-8ef1-73219837a007","Type":"ContainerDied","Data":"9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c"} Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.602311 4992 generic.go:334] "Generic (PLEG): container finished" podID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerID="033da07aacf015cb005c994f281fc65c556eb398d9ca3fd91793e31f5be0ab7e" exitCode=0 Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.602386 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9mrh" event={"ID":"0377ed6d-ea6e-44cb-9d09-0c817af64b22","Type":"ContainerDied","Data":"033da07aacf015cb005c994f281fc65c556eb398d9ca3fd91793e31f5be0ab7e"} Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.605726 4992 generic.go:334] "Generic (PLEG): container finished" podID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerID="81713a8db7fcfe094a259706cd6ae6883b242b7e587950ba9bffdbf43c3652c1" exitCode=0 Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.605820 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-4drj6" event={"ID":"7c4d6b90-976c-46f4-b55f-26d3277cc754","Type":"ContainerDied","Data":"81713a8db7fcfe094a259706cd6ae6883b242b7e587950ba9bffdbf43c3652c1"} Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.609270 4992 generic.go:334] "Generic (PLEG): container finished" podID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerID="600487d362561f7285b19c4570a61b7012fa2e3afa74918c8a8b1ed141eb434d" exitCode=0 Jan 31 09:28:44 crc kubenswrapper[4992]: I0131 09:28:44.609328 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmbpb" event={"ID":"4aab422a-915a-4fd8-a9f2-3f04bdaee9da","Type":"ContainerDied","Data":"600487d362561f7285b19c4570a61b7012fa2e3afa74918c8a8b1ed141eb434d"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.300559 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.300864 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.300914 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.301488 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.301549 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0" gracePeriod=600 Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.625906 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmbpb" event={"ID":"4aab422a-915a-4fd8-a9f2-3f04bdaee9da","Type":"ContainerStarted","Data":"4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.629011 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerStarted","Data":"cde50ef0d4d8d746fe2ede611feba3c14f0686ebb0c86433f846c96fad238b58"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.631732 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b45w" event={"ID":"1cd981c2-7781-4f04-8ef1-73219837a007","Type":"ContainerStarted","Data":"ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.634025 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9mrh" event={"ID":"0377ed6d-ea6e-44cb-9d09-0c817af64b22","Type":"ContainerStarted","Data":"a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 
09:28:45.637283 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drj6" event={"ID":"7c4d6b90-976c-46f4-b55f-26d3277cc754","Type":"ContainerStarted","Data":"94afe569ea782071a7e8325cb527ab1edeb13d943b10205b7ea648d164ca4b31"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.639259 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0" exitCode=0 Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.639437 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0"} Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.644820 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rmbpb" podStartSLOduration=4.584280034 podStartE2EDuration="1m11.64480302s" podCreationTimestamp="2026-01-31 09:27:34 +0000 UTC" firstStartedPulling="2026-01-31 09:27:37.951387445 +0000 UTC m=+153.922779432" lastFinishedPulling="2026-01-31 09:28:45.011910431 +0000 UTC m=+220.983302418" observedRunningTime="2026-01-31 09:28:45.644505731 +0000 UTC m=+221.615897738" watchObservedRunningTime="2026-01-31 09:28:45.64480302 +0000 UTC m=+221.616195007" Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.685834 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4drj6" podStartSLOduration=3.492476188 podStartE2EDuration="1m11.685819165s" podCreationTimestamp="2026-01-31 09:27:34 +0000 UTC" firstStartedPulling="2026-01-31 09:27:36.842913405 +0000 UTC m=+152.814305392" lastFinishedPulling="2026-01-31 09:28:45.036256382 +0000 UTC m=+221.007648369" 
observedRunningTime="2026-01-31 09:28:45.682088084 +0000 UTC m=+221.653480091" watchObservedRunningTime="2026-01-31 09:28:45.685819165 +0000 UTC m=+221.657211152" Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.704919 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j9mrh" podStartSLOduration=3.503939581 podStartE2EDuration="1m11.70490501s" podCreationTimestamp="2026-01-31 09:27:34 +0000 UTC" firstStartedPulling="2026-01-31 09:27:36.930338687 +0000 UTC m=+152.901730674" lastFinishedPulling="2026-01-31 09:28:45.131304116 +0000 UTC m=+221.102696103" observedRunningTime="2026-01-31 09:28:45.702602472 +0000 UTC m=+221.673994479" watchObservedRunningTime="2026-01-31 09:28:45.70490501 +0000 UTC m=+221.676296987" Jan 31 09:28:45 crc kubenswrapper[4992]: I0131 09:28:45.721999 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2b45w" podStartSLOduration=2.721386216 podStartE2EDuration="1m8.721985936s" podCreationTimestamp="2026-01-31 09:27:37 +0000 UTC" firstStartedPulling="2026-01-31 09:27:39.051864052 +0000 UTC m=+155.023256039" lastFinishedPulling="2026-01-31 09:28:45.052463772 +0000 UTC m=+221.023855759" observedRunningTime="2026-01-31 09:28:45.719865623 +0000 UTC m=+221.691257630" watchObservedRunningTime="2026-01-31 09:28:45.721985936 +0000 UTC m=+221.693377913" Jan 31 09:28:47 crc kubenswrapper[4992]: I0131 09:28:47.652520 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"316b87cd3c7723d9291c5891a182a9cf97966bbee250a4d2b5a93c61c18b536c"} Jan 31 09:28:47 crc kubenswrapper[4992]: I0131 09:28:47.655189 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a074f4f-b7f6-4892-be89-083d619c0771" 
containerID="cde50ef0d4d8d746fe2ede611feba3c14f0686ebb0c86433f846c96fad238b58" exitCode=0 Jan 31 09:28:47 crc kubenswrapper[4992]: I0131 09:28:47.655240 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerDied","Data":"cde50ef0d4d8d746fe2ede611feba3c14f0686ebb0c86433f846c96fad238b58"} Jan 31 09:28:47 crc kubenswrapper[4992]: I0131 09:28:47.870930 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:28:47 crc kubenswrapper[4992]: I0131 09:28:47.870987 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:28:47 crc kubenswrapper[4992]: I0131 09:28:47.912896 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:28:48 crc kubenswrapper[4992]: I0131 09:28:48.059713 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:28:48 crc kubenswrapper[4992]: I0131 09:28:48.110565 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:28:50 crc kubenswrapper[4992]: I0131 09:28:50.670123 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerStarted","Data":"a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177"} Jan 31 09:28:50 crc kubenswrapper[4992]: I0131 09:28:50.675087 4992 generic.go:334] "Generic (PLEG): container finished" podID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerID="9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e" exitCode=0 Jan 31 09:28:50 crc kubenswrapper[4992]: I0131 09:28:50.675518 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7j8l" event={"ID":"533c10ab-faa7-4a62-8e8a-2ebd87578ced","Type":"ContainerDied","Data":"9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e"} Jan 31 09:28:50 crc kubenswrapper[4992]: I0131 09:28:50.693070 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pt48t" podStartSLOduration=4.286577247 podStartE2EDuration="1m13.693051423s" podCreationTimestamp="2026-01-31 09:27:37 +0000 UTC" firstStartedPulling="2026-01-31 09:27:40.098014984 +0000 UTC m=+156.069406971" lastFinishedPulling="2026-01-31 09:28:49.50448915 +0000 UTC m=+225.475881147" observedRunningTime="2026-01-31 09:28:50.691134777 +0000 UTC m=+226.662526784" watchObservedRunningTime="2026-01-31 09:28:50.693051423 +0000 UTC m=+226.664443410" Jan 31 09:28:51 crc kubenswrapper[4992]: I0131 09:28:51.683139 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7j8l" event={"ID":"533c10ab-faa7-4a62-8e8a-2ebd87578ced","Type":"ContainerStarted","Data":"008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6"} Jan 31 09:28:51 crc kubenswrapper[4992]: I0131 09:28:51.703835 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b7j8l" podStartSLOduration=3.423655125 podStartE2EDuration="1m17.703815441s" podCreationTimestamp="2026-01-31 09:27:34 +0000 UTC" firstStartedPulling="2026-01-31 09:27:36.860252323 +0000 UTC m=+152.831644310" lastFinishedPulling="2026-01-31 09:28:51.140412639 +0000 UTC m=+227.111804626" observedRunningTime="2026-01-31 09:28:51.701806652 +0000 UTC m=+227.673198649" watchObservedRunningTime="2026-01-31 09:28:51.703815441 +0000 UTC m=+227.675207438" Jan 31 09:28:52 crc kubenswrapper[4992]: I0131 09:28:52.392333 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-vktdq"] Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.689673 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.689976 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.733345 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.772494 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.879034 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.879097 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:28:54 crc kubenswrapper[4992]: I0131 09:28:54.918819 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.121105 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.121193 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.162197 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:28:55 crc 
kubenswrapper[4992]: I0131 09:28:55.377700 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.377736 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.412706 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.739257 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:28:55 crc kubenswrapper[4992]: I0131 09:28:55.749628 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:28:57 crc kubenswrapper[4992]: I0131 09:28:57.914063 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:28:58 crc kubenswrapper[4992]: I0131 09:28:58.197965 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:28:58 crc kubenswrapper[4992]: I0131 09:28:58.198021 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:28:58 crc kubenswrapper[4992]: I0131 09:28:58.249586 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:28:58 crc kubenswrapper[4992]: I0131 09:28:58.502778 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j9mrh"] Jan 31 09:28:58 crc kubenswrapper[4992]: I0131 09:28:58.503042 4992 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-j9mrh" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="registry-server" containerID="cri-o://a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f" gracePeriod=2 Jan 31 09:28:58 crc kubenswrapper[4992]: I0131 09:28:58.758933 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:28:59 crc kubenswrapper[4992]: I0131 09:28:59.090025 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmbpb"] Jan 31 09:28:59 crc kubenswrapper[4992]: I0131 09:28:59.090584 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rmbpb" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="registry-server" containerID="cri-o://4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e" gracePeriod=2 Jan 31 09:29:00 crc kubenswrapper[4992]: I0131 09:29:00.900153 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b45w"] Jan 31 09:29:00 crc kubenswrapper[4992]: I0131 09:29:00.900486 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2b45w" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="registry-server" containerID="cri-o://ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888" gracePeriod=2 Jan 31 09:29:01 crc kubenswrapper[4992]: I0131 09:29:01.493022 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pt48t"] Jan 31 09:29:01 crc kubenswrapper[4992]: I0131 09:29:01.493258 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pt48t" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="registry-server" 
containerID="cri-o://a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177" gracePeriod=2 Jan 31 09:29:04 crc kubenswrapper[4992]: I0131 09:29:04.752690 4992 generic.go:334] "Generic (PLEG): container finished" podID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerID="4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e" exitCode=0 Jan 31 09:29:04 crc kubenswrapper[4992]: I0131 09:29:04.752758 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmbpb" event={"ID":"4aab422a-915a-4fd8-a9f2-3f04bdaee9da","Type":"ContainerDied","Data":"4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e"} Jan 31 09:29:04 crc kubenswrapper[4992]: I0131 09:29:04.917578 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.121110 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f is running failed: container process not found" containerID="a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.121657 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f is running failed: container process not found" containerID="a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.122144 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f is running failed: container process not found" containerID="a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.122205 4992 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-j9mrh" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="registry-server" Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.378812 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e is running failed: container process not found" containerID="4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.379313 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e is running failed: container process not found" containerID="4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.379716 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e is running failed: container process not found" 
containerID="4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:05 crc kubenswrapper[4992]: E0131 09:29:05.379793 4992 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-rmbpb" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="registry-server" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.277262 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.344197 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vmvw\" (UniqueName: \"kubernetes.io/projected/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-kube-api-access-5vmvw\") pod \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.344287 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-catalog-content\") pod \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.344348 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-utilities\") pod \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\" (UID: \"4aab422a-915a-4fd8-a9f2-3f04bdaee9da\") " Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.345092 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-utilities" (OuterVolumeSpecName: "utilities") pod "4aab422a-915a-4fd8-a9f2-3f04bdaee9da" (UID: "4aab422a-915a-4fd8-a9f2-3f04bdaee9da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.351619 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-kube-api-access-5vmvw" (OuterVolumeSpecName: "kube-api-access-5vmvw") pod "4aab422a-915a-4fd8-a9f2-3f04bdaee9da" (UID: "4aab422a-915a-4fd8-a9f2-3f04bdaee9da"). InnerVolumeSpecName "kube-api-access-5vmvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.400859 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aab422a-915a-4fd8-a9f2-3f04bdaee9da" (UID: "4aab422a-915a-4fd8-a9f2-3f04bdaee9da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.446225 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.446279 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vmvw\" (UniqueName: \"kubernetes.io/projected/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-kube-api-access-5vmvw\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.446303 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aab422a-915a-4fd8-a9f2-3f04bdaee9da-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.585916 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j9mrh_0377ed6d-ea6e-44cb-9d09-0c817af64b22/registry-server/0.log" Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.586812 4992 generic.go:334] "Generic (PLEG): container finished" podID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerID="a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f" exitCode=137 Jan 31 09:29:06 crc kubenswrapper[4992]: I0131 09:29:06.586858 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9mrh" event={"ID":"0377ed6d-ea6e-44cb-9d09-0c817af64b22","Type":"ContainerDied","Data":"a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f"} Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.282169 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j9mrh_0377ed6d-ea6e-44cb-9d09-0c817af64b22/registry-server/0.log" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.282835 4992 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:29:07 crc kubenswrapper[4992]: E0131 09:29:07.344219 4992 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aab422a_915a_4fd8_a9f2_3f04bdaee9da.slice\": RecentStats: unable to find data in memory cache]" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.358604 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-catalog-content\") pod \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.358670 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctmrx\" (UniqueName: \"kubernetes.io/projected/0377ed6d-ea6e-44cb-9d09-0c817af64b22-kube-api-access-ctmrx\") pod \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.358765 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-utilities\") pod \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\" (UID: \"0377ed6d-ea6e-44cb-9d09-0c817af64b22\") " Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.359579 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-utilities" (OuterVolumeSpecName: "utilities") pod "0377ed6d-ea6e-44cb-9d09-0c817af64b22" (UID: "0377ed6d-ea6e-44cb-9d09-0c817af64b22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.399161 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0377ed6d-ea6e-44cb-9d09-0c817af64b22-kube-api-access-ctmrx" (OuterVolumeSpecName: "kube-api-access-ctmrx") pod "0377ed6d-ea6e-44cb-9d09-0c817af64b22" (UID: "0377ed6d-ea6e-44cb-9d09-0c817af64b22"). InnerVolumeSpecName "kube-api-access-ctmrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.460587 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctmrx\" (UniqueName: \"kubernetes.io/projected/0377ed6d-ea6e-44cb-9d09-0c817af64b22-kube-api-access-ctmrx\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.460643 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.484743 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.561840 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr2jx\" (UniqueName: \"kubernetes.io/projected/1cd981c2-7781-4f04-8ef1-73219837a007-kube-api-access-zr2jx\") pod \"1cd981c2-7781-4f04-8ef1-73219837a007\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.562191 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-catalog-content\") pod \"1cd981c2-7781-4f04-8ef1-73219837a007\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.562315 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-utilities\") pod \"1cd981c2-7781-4f04-8ef1-73219837a007\" (UID: \"1cd981c2-7781-4f04-8ef1-73219837a007\") " Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.563206 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-utilities" (OuterVolumeSpecName: "utilities") pod "1cd981c2-7781-4f04-8ef1-73219837a007" (UID: "1cd981c2-7781-4f04-8ef1-73219837a007"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.565377 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd981c2-7781-4f04-8ef1-73219837a007-kube-api-access-zr2jx" (OuterVolumeSpecName: "kube-api-access-zr2jx") pod "1cd981c2-7781-4f04-8ef1-73219837a007" (UID: "1cd981c2-7781-4f04-8ef1-73219837a007"). InnerVolumeSpecName "kube-api-access-zr2jx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.598250 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-j9mrh_0377ed6d-ea6e-44cb-9d09-0c817af64b22/registry-server/0.log" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.599476 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j9mrh" event={"ID":"0377ed6d-ea6e-44cb-9d09-0c817af64b22","Type":"ContainerDied","Data":"26250c8d497c57eb4891e22fe2d61a35a354952bf3aac1f5cebc5a9af161fe34"} Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.599544 4992 scope.go:117] "RemoveContainer" containerID="a1c818fecba27a888498741fd7e65d970e67862cd2aae8dc7f6cc173bd26621f" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.599790 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j9mrh" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.604937 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rmbpb" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.605025 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rmbpb" event={"ID":"4aab422a-915a-4fd8-a9f2-3f04bdaee9da","Type":"ContainerDied","Data":"844bc7a9bb1ad388abca2faa9adb6f39f8f2b97274519ed7599b855bddf0557e"} Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.608148 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cd981c2-7781-4f04-8ef1-73219837a007" (UID: "1cd981c2-7781-4f04-8ef1-73219837a007"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.614644 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a074f4f-b7f6-4892-be89-083d619c0771" containerID="a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177" exitCode=0 Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.614726 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerDied","Data":"a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177"} Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.621681 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rmbpb"] Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.622801 4992 generic.go:334] "Generic (PLEG): container finished" podID="1cd981c2-7781-4f04-8ef1-73219837a007" containerID="ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888" exitCode=0 Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.622878 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b45w" event={"ID":"1cd981c2-7781-4f04-8ef1-73219837a007","Type":"ContainerDied","Data":"ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888"} Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.622913 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2b45w" event={"ID":"1cd981c2-7781-4f04-8ef1-73219837a007","Type":"ContainerDied","Data":"047d161dea7959b68e66d8e8d18d8142731e900978747ee23ffbccdc51e30a3b"} Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.622912 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2b45w" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.626558 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rmbpb"] Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.648575 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b45w"] Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.650755 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2b45w"] Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.664304 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.664331 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cd981c2-7781-4f04-8ef1-73219837a007-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:07 crc kubenswrapper[4992]: I0131 09:29:07.664342 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr2jx\" (UniqueName: \"kubernetes.io/projected/1cd981c2-7781-4f04-8ef1-73219837a007-kube-api-access-zr2jx\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.197471 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177 is running failed: container process not found" containerID="a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.197978 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177 is running failed: container process not found" containerID="a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.198468 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177 is running failed: container process not found" containerID="a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.198555 4992 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-pt48t" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.337941 4992 scope.go:117] "RemoveContainer" containerID="033da07aacf015cb005c994f281fc65c556eb398d9ca3fd91793e31f5be0ab7e" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.355649 4992 scope.go:117] "RemoveContainer" containerID="4ae68a2eae1d3a32859991118e6bc242e7005a525e112e1b7d55e956a7a03051" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.386275 4992 scope.go:117] "RemoveContainer" containerID="4d57faecdfb7b40f490d994b932876b0b2f4952bc4540bc81cc9d8019376644e" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.441095 4992 scope.go:117] "RemoveContainer" containerID="600487d362561f7285b19c4570a61b7012fa2e3afa74918c8a8b1ed141eb434d" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.455087 4992 
scope.go:117] "RemoveContainer" containerID="39fdb6bc86da38fb7399f8a5b5e925013e96c8b541d073d0ec07a059f33a0a99" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.482933 4992 scope.go:117] "RemoveContainer" containerID="ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583041 4992 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583240 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="extract-content" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583251 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="extract-content" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583262 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="extract-utilities" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583269 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="extract-utilities" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583275 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865f17ae-618a-4f0a-b79b-3da46e3ea9c9" containerName="pruner" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583282 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="865f17ae-618a-4f0a-b79b-3da46e3ea9c9" containerName="pruner" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583290 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="extract-utilities" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583296 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="extract-utilities" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583307 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583314 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583322 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="extract-content" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583330 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="extract-content" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583337 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583343 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583353 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="extract-content" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583358 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="extract-content" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583367 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583372 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.583383 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="extract-utilities" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583391 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="extract-utilities" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583512 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583527 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583543 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="865f17ae-618a-4f0a-b79b-3da46e3ea9c9" containerName="pruner" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583553 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" containerName="registry-server" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.583934 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.584358 4992 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.584861 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3" gracePeriod=15 Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.584891 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1" gracePeriod=15 Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.584939 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183" gracePeriod=15 Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.584974 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7" gracePeriod=15 Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.584999 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5" gracePeriod=15 Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.586823 4992 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.587207 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587245 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.587275 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587292 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.587314 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587331 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.587363 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587381 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:29:08 crc 
kubenswrapper[4992]: E0131 09:29:08.587406 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587524 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.587556 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587572 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.587609 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587625 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587914 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.587950 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.588095 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.588121 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.588150 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.597958 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:29:08 crc kubenswrapper[4992]: E0131 09:29:08.649031 4992 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.243:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.675749 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.675836 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.675886 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.675928 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.676010 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.676092 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.676133 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.676159 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777237 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777366 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777448 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777660 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777775 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 
crc kubenswrapper[4992]: I0131 09:29:08.777988 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778110 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778217 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778322 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778057 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 
09:29:08.777563 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777830 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778258 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778285 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.777710 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.778395 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:08 crc kubenswrapper[4992]: I0131 09:29:08.950412 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.191133 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd981c2-7781-4f04-8ef1-73219837a007" path="/var/lib/kubelet/pods/1cd981c2-7781-4f04-8ef1-73219837a007/volumes" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.192589 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aab422a-915a-4fd8-a9f2-3f04bdaee9da" path="/var/lib/kubelet/pods/4aab422a-915a-4fd8-a9f2-3f04bdaee9da/volumes" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.386948 4992 scope.go:117] "RemoveContainer" containerID="9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c" Jan 31 09:29:09 crc kubenswrapper[4992]: E0131 09:29:09.450783 4992 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.243:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fc6c9ff15eac0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:29:09.450246848 +0000 UTC m=+245.421638835,LastTimestamp:2026-01-31 09:29:09.450246848 +0000 UTC m=+245.421638835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.522328 4992 scope.go:117] "RemoveContainer" containerID="4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.545723 4992 scope.go:117] "RemoveContainer" containerID="ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888" Jan 31 09:29:09 crc kubenswrapper[4992]: E0131 09:29:09.546215 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888\": container with ID starting with ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888 not found: ID does not exist" containerID="ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.546248 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888"} err="failed to get container status \"ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888\": rpc error: code = NotFound desc = could not find container \"ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888\": container with ID starting with ad0ea257407308ca1a443d52e42bb4ad99adf4118a222e52e37db7536719a888 not found: ID does not exist" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.546293 4992 scope.go:117] "RemoveContainer" containerID="9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c" Jan 31 09:29:09 crc kubenswrapper[4992]: 
E0131 09:29:09.547024 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c\": container with ID starting with 9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c not found: ID does not exist" containerID="9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.547067 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c"} err="failed to get container status \"9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c\": rpc error: code = NotFound desc = could not find container \"9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c\": container with ID starting with 9ffc73f221bea9766c7d0c8c18e4779d4aff641a63629defaff3c01a4666966c not found: ID does not exist" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.547084 4992 scope.go:117] "RemoveContainer" containerID="4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848" Jan 31 09:29:09 crc kubenswrapper[4992]: E0131 09:29:09.547623 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848\": container with ID starting with 4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848 not found: ID does not exist" containerID="4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.547668 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848"} err="failed to get container status \"4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848\": 
rpc error: code = NotFound desc = could not find container \"4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848\": container with ID starting with 4268d54c718ce2d4e4fef85848dc80765aea64de81684b205624d59fe2b59848 not found: ID does not exist" Jan 31 09:29:09 crc kubenswrapper[4992]: I0131 09:29:09.645520 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b9c0f03f731d1cba9b53662f2643e0d4360cd855eeb06d3aed2561b10acbcf63"} Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.136236 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.137123 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.203253 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-catalog-content\") pod \"6a074f4f-b7f6-4892-be89-083d619c0771\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.203400 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzp28\" (UniqueName: \"kubernetes.io/projected/6a074f4f-b7f6-4892-be89-083d619c0771-kube-api-access-kzp28\") pod \"6a074f4f-b7f6-4892-be89-083d619c0771\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.203484 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-utilities\") pod \"6a074f4f-b7f6-4892-be89-083d619c0771\" (UID: \"6a074f4f-b7f6-4892-be89-083d619c0771\") " Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.204315 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-utilities" (OuterVolumeSpecName: "utilities") pod "6a074f4f-b7f6-4892-be89-083d619c0771" (UID: "6a074f4f-b7f6-4892-be89-083d619c0771"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.208183 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a074f4f-b7f6-4892-be89-083d619c0771-kube-api-access-kzp28" (OuterVolumeSpecName: "kube-api-access-kzp28") pod "6a074f4f-b7f6-4892-be89-083d619c0771" (UID: "6a074f4f-b7f6-4892-be89-083d619c0771"). InnerVolumeSpecName "kube-api-access-kzp28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.305041 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzp28\" (UniqueName: \"kubernetes.io/projected/6a074f4f-b7f6-4892-be89-083d619c0771-kube-api-access-kzp28\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.305086 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.736866 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pt48t" event={"ID":"6a074f4f-b7f6-4892-be89-083d619c0771","Type":"ContainerDied","Data":"a74e41a32613e9e1a744d07ea44b6bac4fa6921d009f8c0df1074214a86d0e9b"} Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.737848 4992 scope.go:117] "RemoveContainer" containerID="a4f8b89990b56a8fa2a704b3ef8cd8d6ba4933f46eb46b43b31280261446d177" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.737489 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pt48t" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.739160 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.786674 4992 scope.go:117] "RemoveContainer" containerID="cde50ef0d4d8d746fe2ede611feba3c14f0686ebb0c86433f846c96fad238b58" Jan 31 09:29:10 crc kubenswrapper[4992]: I0131 09:29:10.852715 4992 scope.go:117] "RemoveContainer" containerID="702ae5728dd8474c0c89693eef5d925a36bae8457fd4f04f05b88ade995e7e9b" Jan 31 09:29:11 crc kubenswrapper[4992]: I0131 09:29:11.744596 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:29:11 crc kubenswrapper[4992]: I0131 09:29:11.747891 4992 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1" exitCode=0 Jan 31 09:29:11 crc kubenswrapper[4992]: I0131 09:29:11.747976 4992 scope.go:117] "RemoveContainer" containerID="a9d3709eb633a04c3ea6d05ecb76612d84282e8f3a761ff0522767a7ee490665" Jan 31 09:29:12 crc kubenswrapper[4992]: E0131 09:29:12.275117 4992 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.243:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fc6c9ff15eac0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:29:09.450246848 +0000 UTC m=+245.421638835,LastTimestamp:2026-01-31 09:29:09.450246848 +0000 UTC m=+245.421638835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.652937 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a074f4f-b7f6-4892-be89-083d619c0771" (UID: "6a074f4f-b7f6-4892-be89-083d619c0771"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.676169 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a074f4f-b7f6-4892-be89-083d619c0771-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.755630 4992 generic.go:334] "Generic (PLEG): container finished" podID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" containerID="a3d65764cc5a6d7733838c0524a39001b6f17f244be09f93dbef1134c64a8e62" exitCode=0 Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.755742 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0","Type":"ContainerDied","Data":"a3d65764cc5a6d7733838c0524a39001b6f17f244be09f93dbef1134c64a8e62"} Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.756679 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.757467 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.761086 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.762190 4992 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5" exitCode=0 Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.762330 4992 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7" exitCode=2 Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.857139 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:12 crc kubenswrapper[4992]: I0131 09:29:12.857634 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:13 crc kubenswrapper[4992]: I0131 09:29:13.772530 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:29:13 crc kubenswrapper[4992]: I0131 09:29:13.773486 4992 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183" exitCode=0 Jan 31 09:29:13 crc kubenswrapper[4992]: I0131 09:29:13.775984 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2290a3bfe92f649cc23de6ee848d4fbadc0338a6382f4f92ea6442197a765a89"} Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.050632 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.051120 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.051337 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.196606 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-var-lock\") pod \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.196752 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kube-api-access\") pod \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.196800 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kubelet-dir\") pod \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\" (UID: \"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0\") " Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.197022 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" (UID: "c7fa5f9b-a33f-4527-970e-c7912dbdc9f0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.197022 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-var-lock" (OuterVolumeSpecName: "var-lock") pod "c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" (UID: "c7fa5f9b-a33f-4527-970e-c7912dbdc9f0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.203567 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" (UID: "c7fa5f9b-a33f-4527-970e-c7912dbdc9f0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.298167 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.298202 4992 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.298214 4992 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c7fa5f9b-a33f-4527-970e-c7912dbdc9f0-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.782698 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c7fa5f9b-a33f-4527-970e-c7912dbdc9f0","Type":"ContainerDied","Data":"d7f9b391f32543f4fc00569eaf464944b74b25862ae5fff6249d24aa6954b056"} Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.782742 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7f9b391f32543f4fc00569eaf464944b74b25862ae5fff6249d24aa6954b056" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.782752 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.785582 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.786287 4992 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3" exitCode=0 Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.804123 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:14 crc kubenswrapper[4992]: I0131 09:29:14.804471 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.187577 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.188038 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.624924 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0377ed6d-ea6e-44cb-9d09-0c817af64b22" (UID: "0377ed6d-ea6e-44cb-9d09-0c817af64b22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.714039 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0377ed6d-ea6e-44cb-9d09-0c817af64b22-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.714070 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.714669 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.715003 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 
38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.785110 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.785944 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.786702 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.787198 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.787534 4992 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.787749 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.795221 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.796242 4992 scope.go:117] "RemoveContainer" containerID="fecccc4d05fc4b7d627cc1d8b96d2bbadb2baaf5de85d61672d76de7f6341bf1" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.796470 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.796953 4992 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: E0131 09:29:15.797380 4992 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.243:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.797535 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.797955 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.798334 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.812532 4992 scope.go:117] "RemoveContainer" containerID="62751f1d1f1dcc33da15cf582c0ced64b6c4c38a7f77aa1f43c34aa8fefe3da5" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.832927 4992 scope.go:117] "RemoveContainer" containerID="b513ba029c971003d6034395e57c66baee10e819529061ad6a5783abd34a7183" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.853457 4992 scope.go:117] "RemoveContainer" containerID="6bd1c12081910f7de6a55ff948387615c89eb8ffaf5943b4542c4d7ecc2fb9e7" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.871166 4992 scope.go:117] "RemoveContainer" containerID="6c0d8485c1ff1a729652b1e25e64c8da2a4a389e037ab7f7a41890c7176ad1a3" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.887203 4992 scope.go:117] "RemoveContainer" containerID="ede030ca5257587d3b280dd412bbf496e4781d4b258820c111d1dc65d2a9b2c8" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.916515 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.916681 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.916707 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.916860 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.916952 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.916983 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.917996 4992 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.918031 4992 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:15 crc kubenswrapper[4992]: I0131 09:29:15.918049 4992 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:16 crc kubenswrapper[4992]: I0131 09:29:16.119035 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: I0131 09:29:16.119743 4992 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: I0131 09:29:16.120333 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" 
Jan 31 09:29:16 crc kubenswrapper[4992]: I0131 09:29:16.120832 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.284939 4992 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.285768 4992 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.286498 4992 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.287029 4992 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.287632 4992 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: I0131 09:29:16.287711 4992 controller.go:115] "failed to update lease using latest lease, fallback 
to ensure lease" err="failed 5 attempts to update lease" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.288120 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="200ms" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.489217 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="400ms" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.829045 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:29:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:29:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:29:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:29:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc 
kubenswrapper[4992]: E0131 09:29:16.829460 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.829844 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.830061 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.830305 4992 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.830327 4992 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:29:16 crc kubenswrapper[4992]: E0131 09:29:16.890497 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="800ms" Jan 31 09:29:17 crc kubenswrapper[4992]: I0131 09:29:17.189437 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 31 09:29:17 crc kubenswrapper[4992]: I0131 09:29:17.421871 4992 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" containerID="cri-o://be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a" gracePeriod=15 Jan 31 09:29:17 crc kubenswrapper[4992]: E0131 09:29:17.691621 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="1.6s" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.303269 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.304020 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.304311 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.304532 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": 
dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.304758 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448223 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-error\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448267 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-session\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448295 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-serving-cert\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448322 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-cliconfig\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc 
kubenswrapper[4992]: I0131 09:29:18.448351 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-ocp-branding-template\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448379 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-service-ca\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448394 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsnbx\" (UniqueName: \"kubernetes.io/projected/db3860c3-37de-4fa5-9c79-965abd0e2149-kube-api-access-rsnbx\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448445 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-login\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448487 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-router-certs\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448525 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-provider-selection\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448545 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-dir\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448571 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-idp-0-file-data\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448608 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-trusted-ca-bundle\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.448647 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-policies\") pod \"db3860c3-37de-4fa5-9c79-965abd0e2149\" (UID: \"db3860c3-37de-4fa5-9c79-965abd0e2149\") " Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.450007 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.451895 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.451949 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.452206 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.453489 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.453691 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.454089 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.454466 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.455462 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.455688 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.455975 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3860c3-37de-4fa5-9c79-965abd0e2149-kube-api-access-rsnbx" (OuterVolumeSpecName: "kube-api-access-rsnbx") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "kube-api-access-rsnbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.489410 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.490100 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.490658 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "db3860c3-37de-4fa5-9c79-965abd0e2149" (UID: "db3860c3-37de-4fa5-9c79-965abd0e2149"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550040 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550281 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsnbx\" (UniqueName: \"kubernetes.io/projected/db3860c3-37de-4fa5-9c79-965abd0e2149-kube-api-access-rsnbx\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550558 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550692 
4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550772 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550852 4992 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.550929 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551055 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551149 4992 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551233 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" 
Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551339 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551638 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551707 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.551785 4992 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/db3860c3-37de-4fa5-9c79-965abd0e2149-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.815456 4992 generic.go:334] "Generic (PLEG): container finished" podID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerID="be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a" exitCode=0 Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.815581 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.815576 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" event={"ID":"db3860c3-37de-4fa5-9c79-965abd0e2149","Type":"ContainerDied","Data":"be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a"} Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.815944 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" event={"ID":"db3860c3-37de-4fa5-9c79-965abd0e2149","Type":"ContainerDied","Data":"beccbd9b3c22c35e049051905fc1b8b6b84891beeb328fe8e26316935316466f"} Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.815970 4992 scope.go:117] "RemoveContainer" containerID="be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.816614 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.816974 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.817558 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.817916 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.831585 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.832374 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.832737 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.833090 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.835744 4992 scope.go:117] "RemoveContainer" containerID="be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a" Jan 31 09:29:18 crc kubenswrapper[4992]: E0131 09:29:18.836107 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a\": container with ID starting with be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a not found: ID does not exist" containerID="be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a" Jan 31 09:29:18 crc kubenswrapper[4992]: I0131 09:29:18.836146 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a"} err="failed to get container status \"be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a\": rpc error: code = NotFound desc = could not find container \"be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a\": container with ID starting with be4d9b0214643318f15628b89023e7e6feced132f4eae4a71b8e1444baf68b8a not found: ID does not exist" Jan 31 09:29:19 crc kubenswrapper[4992]: E0131 09:29:19.294443 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.243:6443: connect: connection refused" interval="3.2s" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.260162 4992 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.260288 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 31 09:29:22 crc kubenswrapper[4992]: E0131 09:29:22.276115 4992 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.243:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fc6c9ff15eac0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:29:09.450246848 +0000 UTC m=+245.421638835,LastTimestamp:2026-01-31 09:29:09.450246848 +0000 UTC m=+245.421638835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:29:22 crc kubenswrapper[4992]: E0131 09:29:22.495948 4992 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.129.56.243:6443: connect: connection refused" interval="6.4s" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.843794 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.844079 4992 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255" exitCode=1 Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.844176 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255"} Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.844810 4992 scope.go:117] "RemoveContainer" containerID="03f8b8a806bb67f9013d8f746c8b6dd57b3b5695848f7c78856d2877ddf81255" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.845644 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.846306 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.846867 4992 status_manager.go:851] "Failed to get status for 
pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.847577 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:22 crc kubenswrapper[4992]: I0131 09:29:22.848109 4992 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.182351 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.184268 4992 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.184854 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.185512 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.185958 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.186259 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.197734 4992 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.197767 4992 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:23 crc kubenswrapper[4992]: E0131 09:29:23.198212 4992 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.198709 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:23 crc kubenswrapper[4992]: W0131 09:29:23.218508 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2b4c14ad9ff71a0acf16d68391bbd01bd5d1d79bdcc865c2615e82ca301ea169 WatchSource:0}: Error finding container 2b4c14ad9ff71a0acf16d68391bbd01bd5d1d79bdcc865c2615e82ca301ea169: Status 404 returned error can't find the container with id 2b4c14ad9ff71a0acf16d68391bbd01bd5d1d79bdcc865c2615e82ca301ea169 Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.858023 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2b4c14ad9ff71a0acf16d68391bbd01bd5d1d79bdcc865c2615e82ca301ea169"} Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.863739 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 09:29:23 crc kubenswrapper[4992]: I0131 09:29:23.863819 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"895eca80c93d6decffcc17c39c91e2085f6ddaabd8e62867e0e338dc4d89ccb0"} Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.214253 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.873454 4992 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c0278e9179cead3e5518c7e1b690059ff37429f044b7191d5a25c5c0f41ac80b" exitCode=0 Jan 31 09:29:24 crc 
kubenswrapper[4992]: I0131 09:29:24.873596 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c0278e9179cead3e5518c7e1b690059ff37429f044b7191d5a25c5c0f41ac80b"} Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.873887 4992 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.873928 4992 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:24 crc kubenswrapper[4992]: E0131 09:29:24.874539 4992 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.874625 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.874897 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.875110 4992 status_manager.go:851] "Failed to get status for pod" 
podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.875328 4992 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.875628 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.875869 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.876079 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.876288 4992 status_manager.go:851] "Failed to get status for 
pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.876665 4992 status_manager.go:851] "Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:24 crc kubenswrapper[4992]: I0131 09:29:24.876908 4992 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.189554 4992 status_manager.go:851] "Failed to get status for pod" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" pod="openshift-authentication/oauth-openshift-558db77b4-vktdq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-vktdq\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.189980 4992 status_manager.go:851] "Failed to get status for pod" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" pod="openshift-marketplace/redhat-operators-pt48t" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-pt48t\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.190467 4992 status_manager.go:851] 
"Failed to get status for pod" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" pod="openshift-marketplace/certified-operators-j9mrh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-j9mrh\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.190810 4992 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.191152 4992 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.191557 4992 status_manager.go:851] "Failed to get status for pod" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.243:6443: connect: connection refused" Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.882473 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e6ed78d0a173c8bb0232ea14cb5b67d4c9c0a47be9e063e6f97510660e80c6f9"} Jan 31 09:29:25 crc kubenswrapper[4992]: I0131 09:29:25.882826 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4e23d38674e045df38c67dad80a80016da263f65e987133a57e25abd2e394b41"} Jan 31 09:29:26 crc kubenswrapper[4992]: I0131 09:29:26.890487 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d730412177cf0f2fcd84315bed451fe41fd092cce5afba0c942d2bbf06ccb1e7"} Jan 31 09:29:26 crc kubenswrapper[4992]: I0131 09:29:26.890818 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"31700fb3e1e8efb852f66c192469c4a3e6adc7e827ad2198e38c970e8c452469"} Jan 31 09:29:26 crc kubenswrapper[4992]: I0131 09:29:26.890840 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:26 crc kubenswrapper[4992]: I0131 09:29:26.890854 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"17b870682e989259d9535173ceb7664af4d4659771f368734dc88cc01745007e"} Jan 31 09:29:26 crc kubenswrapper[4992]: I0131 09:29:26.890874 4992 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:26 crc kubenswrapper[4992]: I0131 09:29:26.890905 4992 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:28 crc kubenswrapper[4992]: I0131 09:29:28.200021 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:28 crc kubenswrapper[4992]: I0131 09:29:28.200449 4992 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:28 crc kubenswrapper[4992]: I0131 09:29:28.205601 4992 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]log ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]etcd ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/priority-and-fairness-filter ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-apiextensions-informers ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-apiextensions-controllers ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/crd-informer-synced ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-system-namespaces-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 31 09:29:28 crc kubenswrapper[4992]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 31 09:29:28 crc kubenswrapper[4992]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/bootstrap-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/start-kube-aggregator-informers ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-registration-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-discovery-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]autoregister-completion ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-openapi-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 31 09:29:28 crc kubenswrapper[4992]: livez check failed Jan 31 09:29:28 crc kubenswrapper[4992]: I0131 09:29:28.205700 4992 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.245514 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.246153 4992 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.246366 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.297630 4992 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.330033 4992 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="327bebd5-c523-494f-a8c4-a78e85f56e38" Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.921309 4992 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b" Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.921338 4992 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b08c2628-89dd-47c3-9c25-7799a63c225b"
Jan 31 09:29:32 crc kubenswrapper[4992]: I0131 09:29:32.924965 4992 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="327bebd5-c523-494f-a8c4-a78e85f56e38"
Jan 31 09:29:34 crc kubenswrapper[4992]: I0131 09:29:34.214779 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 09:29:41 crc kubenswrapper[4992]: I0131 09:29:41.556671 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 31 09:29:42 crc kubenswrapper[4992]: I0131 09:29:42.249131 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 09:29:42 crc kubenswrapper[4992]: I0131 09:29:42.254254 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 31 09:29:42 crc kubenswrapper[4992]: I0131 09:29:42.521886 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 31 09:29:42 crc kubenswrapper[4992]: I0131 09:29:42.731219 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 31 09:29:43 crc kubenswrapper[4992]: I0131 09:29:43.580544 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 31 09:29:43 crc kubenswrapper[4992]: I0131 09:29:43.669190 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 31 09:29:43 crc kubenswrapper[4992]: I0131 09:29:43.747948 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 31 09:29:43 crc kubenswrapper[4992]: I0131 09:29:43.984923 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.138822 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.210259 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.215930 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.345929 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.787992 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.819222 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 31 09:29:44 crc kubenswrapper[4992]: I0131 09:29:44.865682 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.059997 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.107673 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.337292 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.400031 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.419895 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.502197 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.627268 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.733672 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.781744 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 31 09:29:45 crc kubenswrapper[4992]: I0131 09:29:45.800705 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.070214 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.118139 4992 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.131101 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.152599 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.193572 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.199912 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.339958 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.483558 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.542081 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.545678 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.591602 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.607026 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.639725 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.847533 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.847904 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.851756 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.866842 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 31 09:29:46 crc kubenswrapper[4992]: I0131 09:29:46.937662 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.071175 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.187299 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.198155 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.265308 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.282413 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.326054 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.405782 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.451064 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.461893 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.709386 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.836019 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.959000 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.977166 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 31 09:29:47 crc kubenswrapper[4992]: I0131 09:29:47.988670 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.156695 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.305216 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.328578 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.366491 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.450139 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.616376 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.633748 4992 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.721217 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 31 09:29:48 crc kubenswrapper[4992]: I0131 09:29:48.947547 4992 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.040216 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.092614 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.122060 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.143848 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.163734 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.247865 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.333677 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.425252 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.452970 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.458002 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.520823 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.778623 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.794478 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.830974 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.898273 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.936764 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 31 09:29:49 crc kubenswrapper[4992]: I0131 09:29:49.938874 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.072972 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.088404 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.095336 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.116438 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.126908 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.138657 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.192602 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.229118 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.303456 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.334874 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.569587 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.592968 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.635508 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.679950 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.694178 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.723110 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.846952 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.891482 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.949162 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 31 09:29:50 crc kubenswrapper[4992]: I0131 09:29:50.984710 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.199245 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.271110 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.398792 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.506053 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.580350 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.598519 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.751987 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 31 09:29:51 crc kubenswrapper[4992]: I0131 09:29:51.773238 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.014521 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.022216 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.126956 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.138039 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.138210 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.152410 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.206015 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.255625 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.362835 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.431532 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.531807 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.557467 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.558285 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.577969 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.647257 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.738481 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 31 09:29:52 crc kubenswrapper[4992]: I0131 09:29:52.862959 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.040164 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.046831 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.137745 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.158895 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.258557 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.259814 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.261786 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.268250 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.333533 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.354007 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.376524 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.392845 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.409356 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.452762 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.490289 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.529415 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.587708 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.599013 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.648200 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.769538 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.773962 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.801660 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.837125 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.875307 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.885255 4992 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.905644 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.959644 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.970311 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 31 09:29:53 crc kubenswrapper[4992]: I0131 09:29:53.991374 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.050855 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.058024 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.058095 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.071773 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.149936 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.165819 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.249112 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.371583 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.449556 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.540510 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.556799 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.572273 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.615306 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.640343 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.746229 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.802772 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.865105 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.866483 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.901597 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.919931 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 31 09:29:54 crc kubenswrapper[4992]: I0131 09:29:54.920327 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.150671 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.153216 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.212218 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.220011 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.224882 4992 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.238141 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.359452 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.409906 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.413772 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.479151 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.503109 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.557356 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.596331 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.616836 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 31 09:29:55 crc kubenswrapper[4992]: I0131 09:29:55.987402 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.024605 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.147619 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.154151 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.237726 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.291813 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.309366 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.358215 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.936988 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.947977 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.988792 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 31 09:29:56 crc kubenswrapper[4992]: I0131 09:29:56.999031 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.021676 4992 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.026252 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vktdq","openshift-marketplace/certified-operators-j9mrh","openshift-marketplace/redhat-operators-pt48t","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.026323 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.031064 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.044091 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.044077493 podStartE2EDuration="25.044077493s" podCreationTimestamp="2026-01-31 09:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:29:57.042794705 +0000 UTC m=+293.014186702" watchObservedRunningTime="2026-01-31 09:29:57.044077493 +0000 UTC m=+293.015469480"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.065614 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.189311 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0377ed6d-ea6e-44cb-9d09-0c817af64b22" path="/var/lib/kubelet/pods/0377ed6d-ea6e-44cb-9d09-0c817af64b22/volumes"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.190193 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" path="/var/lib/kubelet/pods/6a074f4f-b7f6-4892-be89-083d619c0771/volumes"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.190852 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" path="/var/lib/kubelet/pods/db3860c3-37de-4fa5-9c79-965abd0e2149/volumes"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.200950 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.301350 4992 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.523188 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.605897 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.606815 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.679240 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.724986 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 09:29:57 crc kubenswrapper[4992]: I0131 09:29:57.750110 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 09:29:58 crc kubenswrapper[4992]: I0131 09:29:58.119471 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 09:29:58 crc kubenswrapper[4992]: I0131 09:29:58.204197 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:58 crc kubenswrapper[4992]: I0131 09:29:58.208558 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:29:58 crc kubenswrapper[4992]: I0131 09:29:58.291033 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 09:29:58 crc kubenswrapper[4992]: I0131 
09:29:58.591119 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177089 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5"] Jan 31 09:30:00 crc kubenswrapper[4992]: E0131 09:30:00.177320 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="extract-utilities" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177334 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="extract-utilities" Jan 31 09:30:00 crc kubenswrapper[4992]: E0131 09:30:00.177344 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177351 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" Jan 31 09:30:00 crc kubenswrapper[4992]: E0131 09:30:00.177368 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="extract-content" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177377 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="extract-content" Jan 31 09:30:00 crc kubenswrapper[4992]: E0131 09:30:00.177391 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" containerName="installer" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177397 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" containerName="installer" Jan 31 09:30:00 crc kubenswrapper[4992]: E0131 09:30:00.177406 4992 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="registry-server" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177466 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="registry-server" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177568 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3860c3-37de-4fa5-9c79-965abd0e2149" containerName="oauth-openshift" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177580 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7fa5f9b-a33f-4527-970e-c7912dbdc9f0" containerName="installer" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177585 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a074f4f-b7f6-4892-be89-083d619c0771" containerName="registry-server" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.177936 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.179578 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.179676 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.186767 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5"] Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.335476 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2732f7b0-a210-4d8a-82a1-952cabafab5d-config-volume\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.335601 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsm47\" (UniqueName: \"kubernetes.io/projected/2732f7b0-a210-4d8a-82a1-952cabafab5d-kube-api-access-nsm47\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.335654 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2732f7b0-a210-4d8a-82a1-952cabafab5d-secret-volume\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.436653 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsm47\" (UniqueName: \"kubernetes.io/projected/2732f7b0-a210-4d8a-82a1-952cabafab5d-kube-api-access-nsm47\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.436741 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2732f7b0-a210-4d8a-82a1-952cabafab5d-secret-volume\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.436807 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2732f7b0-a210-4d8a-82a1-952cabafab5d-config-volume\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.438154 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2732f7b0-a210-4d8a-82a1-952cabafab5d-config-volume\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.443402 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2732f7b0-a210-4d8a-82a1-952cabafab5d-secret-volume\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.452311 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsm47\" (UniqueName: \"kubernetes.io/projected/2732f7b0-a210-4d8a-82a1-952cabafab5d-kube-api-access-nsm47\") pod \"collect-profiles-29497530-m2dt5\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.494956 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:00 crc kubenswrapper[4992]: I0131 09:30:00.709791 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5"] Jan 31 09:30:00 crc kubenswrapper[4992]: W0131 09:30:00.715399 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2732f7b0_a210_4d8a_82a1_952cabafab5d.slice/crio-d1350e92c757a2996be2ad356b2487a39361a8a587bc9711cbc1f384aaa8ccbe WatchSource:0}: Error finding container d1350e92c757a2996be2ad356b2487a39361a8a587bc9711cbc1f384aaa8ccbe: Status 404 returned error can't find the container with id d1350e92c757a2996be2ad356b2487a39361a8a587bc9711cbc1f384aaa8ccbe Jan 31 09:30:01 crc kubenswrapper[4992]: I0131 09:30:01.073439 4992 generic.go:334] "Generic (PLEG): container finished" podID="2732f7b0-a210-4d8a-82a1-952cabafab5d" containerID="5f0a48802e49e695dfdea7d957e53147c205f4405303ce5f48ab323d4d287758" exitCode=0 Jan 31 09:30:01 crc kubenswrapper[4992]: I0131 09:30:01.073514 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" event={"ID":"2732f7b0-a210-4d8a-82a1-952cabafab5d","Type":"ContainerDied","Data":"5f0a48802e49e695dfdea7d957e53147c205f4405303ce5f48ab323d4d287758"} Jan 31 09:30:01 crc kubenswrapper[4992]: I0131 09:30:01.073754 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" event={"ID":"2732f7b0-a210-4d8a-82a1-952cabafab5d","Type":"ContainerStarted","Data":"d1350e92c757a2996be2ad356b2487a39361a8a587bc9711cbc1f384aaa8ccbe"} Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.310172 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.460746 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2732f7b0-a210-4d8a-82a1-952cabafab5d-secret-volume\") pod \"2732f7b0-a210-4d8a-82a1-952cabafab5d\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.460809 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2732f7b0-a210-4d8a-82a1-952cabafab5d-config-volume\") pod \"2732f7b0-a210-4d8a-82a1-952cabafab5d\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.460895 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsm47\" (UniqueName: \"kubernetes.io/projected/2732f7b0-a210-4d8a-82a1-952cabafab5d-kube-api-access-nsm47\") pod \"2732f7b0-a210-4d8a-82a1-952cabafab5d\" (UID: \"2732f7b0-a210-4d8a-82a1-952cabafab5d\") " Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.461498 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2732f7b0-a210-4d8a-82a1-952cabafab5d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2732f7b0-a210-4d8a-82a1-952cabafab5d" (UID: "2732f7b0-a210-4d8a-82a1-952cabafab5d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.465479 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2732f7b0-a210-4d8a-82a1-952cabafab5d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2732f7b0-a210-4d8a-82a1-952cabafab5d" (UID: "2732f7b0-a210-4d8a-82a1-952cabafab5d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.465823 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2732f7b0-a210-4d8a-82a1-952cabafab5d-kube-api-access-nsm47" (OuterVolumeSpecName: "kube-api-access-nsm47") pod "2732f7b0-a210-4d8a-82a1-952cabafab5d" (UID: "2732f7b0-a210-4d8a-82a1-952cabafab5d"). InnerVolumeSpecName "kube-api-access-nsm47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.562165 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2732f7b0-a210-4d8a-82a1-952cabafab5d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.562206 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2732f7b0-a210-4d8a-82a1-952cabafab5d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:02 crc kubenswrapper[4992]: I0131 09:30:02.562217 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsm47\" (UniqueName: \"kubernetes.io/projected/2732f7b0-a210-4d8a-82a1-952cabafab5d-kube-api-access-nsm47\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:03 crc kubenswrapper[4992]: I0131 09:30:03.086305 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" event={"ID":"2732f7b0-a210-4d8a-82a1-952cabafab5d","Type":"ContainerDied","Data":"d1350e92c757a2996be2ad356b2487a39361a8a587bc9711cbc1f384aaa8ccbe"} Jan 31 09:30:03 crc kubenswrapper[4992]: I0131 09:30:03.086347 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1350e92c757a2996be2ad356b2487a39361a8a587bc9711cbc1f384aaa8ccbe" Jan 31 09:30:03 crc kubenswrapper[4992]: I0131 09:30:03.086387 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5" Jan 31 09:30:04 crc kubenswrapper[4992]: I0131 09:30:04.962996 4992 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.067858 4992 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.068165 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://2290a3bfe92f649cc23de6ee848d4fbadc0338a6382f4f92ea6442197a765a89" gracePeriod=5 Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.644461 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5bd8644659-pn227"] Jan 31 09:30:05 crc kubenswrapper[4992]: E0131 09:30:05.644704 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.644723 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:30:05 crc kubenswrapper[4992]: E0131 09:30:05.644745 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2732f7b0-a210-4d8a-82a1-952cabafab5d" containerName="collect-profiles" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.644754 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2732f7b0-a210-4d8a-82a1-952cabafab5d" containerName="collect-profiles" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.644865 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="2732f7b0-a210-4d8a-82a1-952cabafab5d" 
containerName="collect-profiles" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.644878 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.645518 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.651568 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.651994 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652106 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652124 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652182 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652503 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652571 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652625 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 
09:30:05.652630 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.652921 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.658485 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5bd8644659-pn227"] Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.658742 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.675190 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.689938 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.690225 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.690751 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802687 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-audit-policies\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802729 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/359eb401-8f21-4b94-9f7c-8d78265e91ee-audit-dir\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802773 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802792 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802819 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802870 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-session\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802889 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-router-certs\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802902 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-login\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.802966 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.803038 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-error\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: 
\"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.803058 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.803093 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4x2\" (UniqueName: \"kubernetes.io/projected/359eb401-8f21-4b94-9f7c-8d78265e91ee-kube-api-access-qn4x2\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.803129 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-service-ca\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.803173 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 
09:30:05.904488 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904556 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904590 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904629 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-session\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904651 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-router-certs\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904811 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-login\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904866 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904947 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-error\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.904970 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " 
pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905005 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4x2\" (UniqueName: \"kubernetes.io/projected/359eb401-8f21-4b94-9f7c-8d78265e91ee-kube-api-access-qn4x2\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905030 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-service-ca\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905067 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905093 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-audit-policies\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905109 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/359eb401-8f21-4b94-9f7c-8d78265e91ee-audit-dir\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905237 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/359eb401-8f21-4b94-9f7c-8d78265e91ee-audit-dir\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.905886 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.907465 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-service-ca\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.907654 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-audit-policies\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.908119 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.909659 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.910179 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-login\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.911278 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-router-certs\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.911323 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.917571 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-template-error\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.917817 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.924467 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.928197 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/359eb401-8f21-4b94-9f7c-8d78265e91ee-v4-0-config-system-session\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:05 crc kubenswrapper[4992]: I0131 09:30:05.939142 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qn4x2\" (UniqueName: \"kubernetes.io/projected/359eb401-8f21-4b94-9f7c-8d78265e91ee-kube-api-access-qn4x2\") pod \"oauth-openshift-5bd8644659-pn227\" (UID: \"359eb401-8f21-4b94-9f7c-8d78265e91ee\") " pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:06 crc kubenswrapper[4992]: I0131 09:30:06.016184 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:06 crc kubenswrapper[4992]: I0131 09:30:06.406826 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5bd8644659-pn227"] Jan 31 09:30:07 crc kubenswrapper[4992]: I0131 09:30:07.113543 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" event={"ID":"359eb401-8f21-4b94-9f7c-8d78265e91ee","Type":"ContainerStarted","Data":"2efc9442bc127bedfc380ae56b19ccb7a4253976304e3fbfc55fcac67de54520"} Jan 31 09:30:07 crc kubenswrapper[4992]: I0131 09:30:07.113593 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" event={"ID":"359eb401-8f21-4b94-9f7c-8d78265e91ee","Type":"ContainerStarted","Data":"8a6989874dd6fa405b4a1bb83d00c900b8faba6bd80c789d50b9266e1272ea06"} Jan 31 09:30:07 crc kubenswrapper[4992]: I0131 09:30:07.113949 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:07 crc kubenswrapper[4992]: I0131 09:30:07.118727 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" Jan 31 09:30:07 crc kubenswrapper[4992]: I0131 09:30:07.138176 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5bd8644659-pn227" podStartSLOduration=75.138156913 
podStartE2EDuration="1m15.138156913s" podCreationTimestamp="2026-01-31 09:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:07.133688668 +0000 UTC m=+303.105080695" watchObservedRunningTime="2026-01-31 09:30:07.138156913 +0000 UTC m=+303.109548910" Jan 31 09:30:07 crc kubenswrapper[4992]: I0131 09:30:07.575646 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.035929 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.129598 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.129652 4992 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="2290a3bfe92f649cc23de6ee848d4fbadc0338a6382f4f92ea6442197a765a89" exitCode=137 Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.645120 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.645213 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783309 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783489 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783532 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783525 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783568 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783586 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783609 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783627 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.783716 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.784201 4992 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.784241 4992 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.784261 4992 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.784279 4992 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.795605 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:30:10 crc kubenswrapper[4992]: I0131 09:30:10.885693 4992 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:11 crc kubenswrapper[4992]: I0131 09:30:11.140697 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:30:11 crc kubenswrapper[4992]: I0131 09:30:11.140813 4992 scope.go:117] "RemoveContainer" containerID="2290a3bfe92f649cc23de6ee848d4fbadc0338a6382f4f92ea6442197a765a89" Jan 31 09:30:11 crc kubenswrapper[4992]: I0131 09:30:11.140969 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:30:11 crc kubenswrapper[4992]: I0131 09:30:11.198052 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 31 09:30:12 crc kubenswrapper[4992]: I0131 09:30:12.171560 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 09:30:12 crc kubenswrapper[4992]: I0131 09:30:12.567668 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.228972 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f998f6c97-kmgqq"] Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.230215 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" 
podUID="a505a892-3153-4dfe-abdb-c83998d795c1" containerName="controller-manager" containerID="cri-o://09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb" gracePeriod=30 Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.327697 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v"] Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.328246 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" podUID="b194d796-8bc3-4599-905b-16c28f19f7f4" containerName="route-controller-manager" containerID="cri-o://a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779" gracePeriod=30 Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.595460 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.683198 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.732848 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-proxy-ca-bundles\") pod \"a505a892-3153-4dfe-abdb-c83998d795c1\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.732923 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-client-ca\") pod \"a505a892-3153-4dfe-abdb-c83998d795c1\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.732959 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a505a892-3153-4dfe-abdb-c83998d795c1-serving-cert\") pod \"a505a892-3153-4dfe-abdb-c83998d795c1\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.733031 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-config\") pod \"a505a892-3153-4dfe-abdb-c83998d795c1\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.733710 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-client-ca" (OuterVolumeSpecName: "client-ca") pod "a505a892-3153-4dfe-abdb-c83998d795c1" (UID: "a505a892-3153-4dfe-abdb-c83998d795c1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.733774 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a505a892-3153-4dfe-abdb-c83998d795c1" (UID: "a505a892-3153-4dfe-abdb-c83998d795c1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.733787 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-config" (OuterVolumeSpecName: "config") pod "a505a892-3153-4dfe-abdb-c83998d795c1" (UID: "a505a892-3153-4dfe-abdb-c83998d795c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.733995 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bhhg\" (UniqueName: \"kubernetes.io/projected/a505a892-3153-4dfe-abdb-c83998d795c1-kube-api-access-2bhhg\") pod \"a505a892-3153-4dfe-abdb-c83998d795c1\" (UID: \"a505a892-3153-4dfe-abdb-c83998d795c1\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.734403 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.734441 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.734451 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a505a892-3153-4dfe-abdb-c83998d795c1-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.738341 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a505a892-3153-4dfe-abdb-c83998d795c1-kube-api-access-2bhhg" (OuterVolumeSpecName: "kube-api-access-2bhhg") pod "a505a892-3153-4dfe-abdb-c83998d795c1" (UID: "a505a892-3153-4dfe-abdb-c83998d795c1"). InnerVolumeSpecName "kube-api-access-2bhhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.738406 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a505a892-3153-4dfe-abdb-c83998d795c1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a505a892-3153-4dfe-abdb-c83998d795c1" (UID: "a505a892-3153-4dfe-abdb-c83998d795c1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.835208 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-client-ca\") pod \"b194d796-8bc3-4599-905b-16c28f19f7f4\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.835319 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4jj8\" (UniqueName: \"kubernetes.io/projected/b194d796-8bc3-4599-905b-16c28f19f7f4-kube-api-access-k4jj8\") pod \"b194d796-8bc3-4599-905b-16c28f19f7f4\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.835374 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b194d796-8bc3-4599-905b-16c28f19f7f4-serving-cert\") pod 
\"b194d796-8bc3-4599-905b-16c28f19f7f4\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.835499 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-config\") pod \"b194d796-8bc3-4599-905b-16c28f19f7f4\" (UID: \"b194d796-8bc3-4599-905b-16c28f19f7f4\") " Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.835694 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bhhg\" (UniqueName: \"kubernetes.io/projected/a505a892-3153-4dfe-abdb-c83998d795c1-kube-api-access-2bhhg\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.835708 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a505a892-3153-4dfe-abdb-c83998d795c1-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.836292 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "b194d796-8bc3-4599-905b-16c28f19f7f4" (UID: "b194d796-8bc3-4599-905b-16c28f19f7f4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.836406 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-config" (OuterVolumeSpecName: "config") pod "b194d796-8bc3-4599-905b-16c28f19f7f4" (UID: "b194d796-8bc3-4599-905b-16c28f19f7f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.838755 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b194d796-8bc3-4599-905b-16c28f19f7f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b194d796-8bc3-4599-905b-16c28f19f7f4" (UID: "b194d796-8bc3-4599-905b-16c28f19f7f4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.840077 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b194d796-8bc3-4599-905b-16c28f19f7f4-kube-api-access-k4jj8" (OuterVolumeSpecName: "kube-api-access-k4jj8") pod "b194d796-8bc3-4599-905b-16c28f19f7f4" (UID: "b194d796-8bc3-4599-905b-16c28f19f7f4"). InnerVolumeSpecName "kube-api-access-k4jj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.937214 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.937250 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4jj8\" (UniqueName: \"kubernetes.io/projected/b194d796-8bc3-4599-905b-16c28f19f7f4-kube-api-access-k4jj8\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.937265 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b194d796-8bc3-4599-905b-16c28f19f7f4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.937277 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b194d796-8bc3-4599-905b-16c28f19f7f4-config\") on node \"crc\" DevicePath 
\"\"" Jan 31 09:30:14 crc kubenswrapper[4992]: I0131 09:30:14.967854 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.170000 4992 generic.go:334] "Generic (PLEG): container finished" podID="a505a892-3153-4dfe-abdb-c83998d795c1" containerID="09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb" exitCode=0 Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.170063 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" event={"ID":"a505a892-3153-4dfe-abdb-c83998d795c1","Type":"ContainerDied","Data":"09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb"} Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.170072 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.170438 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f998f6c97-kmgqq" event={"ID":"a505a892-3153-4dfe-abdb-c83998d795c1","Type":"ContainerDied","Data":"27e84e605c81817d5fbc50663b4b3d6ff3e41da737195224c29532ec60cb9e3b"} Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.170591 4992 scope.go:117] "RemoveContainer" containerID="09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.172711 4992 generic.go:334] "Generic (PLEG): container finished" podID="b194d796-8bc3-4599-905b-16c28f19f7f4" containerID="a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779" exitCode=0 Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.172772 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" 
event={"ID":"b194d796-8bc3-4599-905b-16c28f19f7f4","Type":"ContainerDied","Data":"a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779"} Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.172821 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" event={"ID":"b194d796-8bc3-4599-905b-16c28f19f7f4","Type":"ContainerDied","Data":"90b61467e9ee7368035af90feb7174e4d7f59ee1394da50c84e1a601e5ae8413"} Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.172900 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.192520 4992 scope.go:117] "RemoveContainer" containerID="09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb" Jan 31 09:30:15 crc kubenswrapper[4992]: E0131 09:30:15.193467 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb\": container with ID starting with 09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb not found: ID does not exist" containerID="09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.193520 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb"} err="failed to get container status \"09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb\": rpc error: code = NotFound desc = could not find container \"09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb\": container with ID starting with 09ab3f7b82266aa4037a7eb3692ecbe518fe10564ce1ffaebbc91da904ce1ecb not found: ID does not exist" Jan 31 09:30:15 crc 
kubenswrapper[4992]: I0131 09:30:15.193561 4992 scope.go:117] "RemoveContainer" containerID="a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.212897 4992 scope.go:117] "RemoveContainer" containerID="a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779" Jan 31 09:30:15 crc kubenswrapper[4992]: E0131 09:30:15.214537 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779\": container with ID starting with a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779 not found: ID does not exist" containerID="a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.214610 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779"} err="failed to get container status \"a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779\": rpc error: code = NotFound desc = could not find container \"a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779\": container with ID starting with a83bd5bd79bf123f7716fcdbdc52eca42f23dca88b6302dfe4fa8f9ccf996779 not found: ID does not exist" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.219835 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v"] Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.223381 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96785bd75-4kl7v"] Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.230950 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f998f6c97-kmgqq"] Jan 31 
09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.233831 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f998f6c97-kmgqq"] Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.473753 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.651287 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q"] Jan 31 09:30:15 crc kubenswrapper[4992]: E0131 09:30:15.651663 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a505a892-3153-4dfe-abdb-c83998d795c1" containerName="controller-manager" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.651686 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a505a892-3153-4dfe-abdb-c83998d795c1" containerName="controller-manager" Jan 31 09:30:15 crc kubenswrapper[4992]: E0131 09:30:15.651714 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b194d796-8bc3-4599-905b-16c28f19f7f4" containerName="route-controller-manager" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.651728 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b194d796-8bc3-4599-905b-16c28f19f7f4" containerName="route-controller-manager" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.651903 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a505a892-3153-4dfe-abdb-c83998d795c1" containerName="controller-manager" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.651932 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b194d796-8bc3-4599-905b-16c28f19f7f4" containerName="route-controller-manager" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.652561 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.656017 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.656262 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.656371 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.656415 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8"] Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.656769 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.656724 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.657363 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.664326 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.664631 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.665002 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.665599 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.665550 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.666026 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.666197 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.677920 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.685510 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q"] Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.700177 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8"] Jan 31 09:30:15 
crc kubenswrapper[4992]: I0131 09:30:15.748810 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-client-ca\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749152 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4h98\" (UniqueName: \"kubernetes.io/projected/336fed2a-afcc-4624-bcb9-49c27c12f9f3-kube-api-access-j4h98\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749395 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-client-ca\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749559 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-config\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749685 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98tzm\" (UniqueName: 
\"kubernetes.io/projected/a7d0f327-d854-4e68-ad2c-3e38949c4a34-kube-api-access-98tzm\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749846 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336fed2a-afcc-4624-bcb9-49c27c12f9f3-serving-cert\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749933 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-proxy-ca-bundles\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.749984 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-config\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.750021 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d0f327-d854-4e68-ad2c-3e38949c4a34-serving-cert\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.851809 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-config\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.852122 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d0f327-d854-4e68-ad2c-3e38949c4a34-serving-cert\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.852284 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-client-ca\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.852410 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4h98\" (UniqueName: \"kubernetes.io/projected/336fed2a-afcc-4624-bcb9-49c27c12f9f3-kube-api-access-j4h98\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.852705 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-config\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.852885 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-client-ca\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.853069 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98tzm\" (UniqueName: \"kubernetes.io/projected/a7d0f327-d854-4e68-ad2c-3e38949c4a34-kube-api-access-98tzm\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.853246 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336fed2a-afcc-4624-bcb9-49c27c12f9f3-serving-cert\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.853435 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-proxy-ca-bundles\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc 
kubenswrapper[4992]: I0131 09:30:15.854259 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-client-ca\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.854381 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-client-ca\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.854721 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-config\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.854889 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-config\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.856930 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-proxy-ca-bundles\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " 
pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.867508 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336fed2a-afcc-4624-bcb9-49c27c12f9f3-serving-cert\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.869234 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d0f327-d854-4e68-ad2c-3e38949c4a34-serving-cert\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.873529 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4h98\" (UniqueName: \"kubernetes.io/projected/336fed2a-afcc-4624-bcb9-49c27c12f9f3-kube-api-access-j4h98\") pod \"controller-manager-794fdf6f6b-4wtf8\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.885620 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98tzm\" (UniqueName: \"kubernetes.io/projected/a7d0f327-d854-4e68-ad2c-3e38949c4a34-kube-api-access-98tzm\") pod \"route-controller-manager-7c8b474d54-h449q\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:15 crc kubenswrapper[4992]: I0131 09:30:15.989493 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:16 crc kubenswrapper[4992]: I0131 09:30:16.010232 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:16 crc kubenswrapper[4992]: I0131 09:30:16.279350 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8"] Jan 31 09:30:16 crc kubenswrapper[4992]: I0131 09:30:16.309891 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q"] Jan 31 09:30:16 crc kubenswrapper[4992]: W0131 09:30:16.313137 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7d0f327_d854_4e68_ad2c_3e38949c4a34.slice/crio-dcf9c803b346621be2ecb1518f83cee715d3d66f1fde99fefa372f037a7363e4 WatchSource:0}: Error finding container dcf9c803b346621be2ecb1518f83cee715d3d66f1fde99fefa372f037a7363e4: Status 404 returned error can't find the container with id dcf9c803b346621be2ecb1518f83cee715d3d66f1fde99fefa372f037a7363e4 Jan 31 09:30:16 crc kubenswrapper[4992]: I0131 09:30:16.464383 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.189007 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a505a892-3153-4dfe-abdb-c83998d795c1" path="/var/lib/kubelet/pods/a505a892-3153-4dfe-abdb-c83998d795c1/volumes" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.189727 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b194d796-8bc3-4599-905b-16c28f19f7f4" path="/var/lib/kubelet/pods/b194d796-8bc3-4599-905b-16c28f19f7f4/volumes" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.190198 4992 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.190220 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" event={"ID":"a7d0f327-d854-4e68-ad2c-3e38949c4a34","Type":"ContainerStarted","Data":"d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f"} Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.190232 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" event={"ID":"a7d0f327-d854-4e68-ad2c-3e38949c4a34","Type":"ContainerStarted","Data":"dcf9c803b346621be2ecb1518f83cee715d3d66f1fde99fefa372f037a7363e4"} Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.190242 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" event={"ID":"336fed2a-afcc-4624-bcb9-49c27c12f9f3","Type":"ContainerStarted","Data":"7a2971165886bf8c8533284f55c8e51054e5549d255620da7222788ce1d049da"} Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.190252 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.190260 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" event={"ID":"336fed2a-afcc-4624-bcb9-49c27c12f9f3","Type":"ContainerStarted","Data":"14aa488a724d7a5ec8c326ed68d2f8d9f7d0396a27066c54c0e3f8e9f8adb5f9"} Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.194694 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:17 crc kubenswrapper[4992]: 
I0131 09:30:17.195321 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.208663 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" podStartSLOduration=3.20864309 podStartE2EDuration="3.20864309s" podCreationTimestamp="2026-01-31 09:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:17.20565144 +0000 UTC m=+313.177043437" watchObservedRunningTime="2026-01-31 09:30:17.20864309 +0000 UTC m=+313.180035077" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.270309 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" podStartSLOduration=3.270281264 podStartE2EDuration="3.270281264s" podCreationTimestamp="2026-01-31 09:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:17.269193551 +0000 UTC m=+313.240585558" watchObservedRunningTime="2026-01-31 09:30:17.270281264 +0000 UTC m=+313.241673281" Jan 31 09:30:17 crc kubenswrapper[4992]: I0131 09:30:17.342927 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 09:30:20 crc kubenswrapper[4992]: I0131 09:30:20.540828 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 09:30:21 crc kubenswrapper[4992]: I0131 09:30:21.925967 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 09:30:23 crc kubenswrapper[4992]: I0131 09:30:23.646258 4992 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 09:30:25 crc kubenswrapper[4992]: I0131 09:30:25.533072 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 09:30:27 crc kubenswrapper[4992]: I0131 09:30:27.539636 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 09:30:29 crc kubenswrapper[4992]: I0131 09:30:29.071256 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 09:30:29 crc kubenswrapper[4992]: I0131 09:30:29.289972 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 09:30:30 crc kubenswrapper[4992]: I0131 09:30:30.910928 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 09:30:32 crc kubenswrapper[4992]: I0131 09:30:32.433994 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 09:30:34 crc kubenswrapper[4992]: I0131 09:30:34.233142 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8"] Jan 31 09:30:34 crc kubenswrapper[4992]: I0131 09:30:34.233700 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" podUID="336fed2a-afcc-4624-bcb9-49c27c12f9f3" containerName="controller-manager" containerID="cri-o://7a2971165886bf8c8533284f55c8e51054e5549d255620da7222788ce1d049da" gracePeriod=30 Jan 31 09:30:34 crc kubenswrapper[4992]: I0131 09:30:34.249948 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q"] Jan 31 09:30:34 crc kubenswrapper[4992]: I0131 09:30:34.250240 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" podUID="a7d0f327-d854-4e68-ad2c-3e38949c4a34" containerName="route-controller-manager" containerID="cri-o://d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f" gracePeriod=30 Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.267229 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.292107 4992 generic.go:334] "Generic (PLEG): container finished" podID="336fed2a-afcc-4624-bcb9-49c27c12f9f3" containerID="7a2971165886bf8c8533284f55c8e51054e5549d255620da7222788ce1d049da" exitCode=0 Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.292203 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" event={"ID":"336fed2a-afcc-4624-bcb9-49c27c12f9f3","Type":"ContainerDied","Data":"7a2971165886bf8c8533284f55c8e51054e5549d255620da7222788ce1d049da"} Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.294339 4992 generic.go:334] "Generic (PLEG): container finished" podID="a7d0f327-d854-4e68-ad2c-3e38949c4a34" containerID="d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f" exitCode=0 Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.294377 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" event={"ID":"a7d0f327-d854-4e68-ad2c-3e38949c4a34","Type":"ContainerDied","Data":"d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f"} Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.294404 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" event={"ID":"a7d0f327-d854-4e68-ad2c-3e38949c4a34","Type":"ContainerDied","Data":"dcf9c803b346621be2ecb1518f83cee715d3d66f1fde99fefa372f037a7363e4"} Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.294449 4992 scope.go:117] "RemoveContainer" containerID="d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.294449 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.303162 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s"] Jan 31 09:30:35 crc kubenswrapper[4992]: E0131 09:30:35.303459 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d0f327-d854-4e68-ad2c-3e38949c4a34" containerName="route-controller-manager" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.303478 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d0f327-d854-4e68-ad2c-3e38949c4a34" containerName="route-controller-manager" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.303614 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d0f327-d854-4e68-ad2c-3e38949c4a34" containerName="route-controller-manager" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.304099 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.316482 4992 scope.go:117] "RemoveContainer" containerID="d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f" Jan 31 09:30:35 crc kubenswrapper[4992]: E0131 09:30:35.318001 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f\": container with ID starting with d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f not found: ID does not exist" containerID="d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.318059 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f"} err="failed to get container status \"d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f\": rpc error: code = NotFound desc = could not find container \"d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f\": container with ID starting with d14a93fe3c774bc12cf0f4fed9d83c020d16de86ff530772db4d7cdb75e5594f not found: ID does not exist" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.323044 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s"] Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.358748 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-client-ca\") pod \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.358842 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-98tzm\" (UniqueName: \"kubernetes.io/projected/a7d0f327-d854-4e68-ad2c-3e38949c4a34-kube-api-access-98tzm\") pod \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.358913 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-config\") pod \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.358989 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7d0f327-d854-4e68-ad2c-3e38949c4a34-serving-cert\") pod \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\" (UID: \"a7d0f327-d854-4e68-ad2c-3e38949c4a34\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.359172 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-serving-cert\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.359236 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-client-ca\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.359285 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-config\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.359306 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78ld\" (UniqueName: \"kubernetes.io/projected/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-kube-api-access-f78ld\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.359814 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-config" (OuterVolumeSpecName: "config") pod "a7d0f327-d854-4e68-ad2c-3e38949c4a34" (UID: "a7d0f327-d854-4e68-ad2c-3e38949c4a34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.360012 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-client-ca" (OuterVolumeSpecName: "client-ca") pod "a7d0f327-d854-4e68-ad2c-3e38949c4a34" (UID: "a7d0f327-d854-4e68-ad2c-3e38949c4a34"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.363683 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7d0f327-d854-4e68-ad2c-3e38949c4a34-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a7d0f327-d854-4e68-ad2c-3e38949c4a34" (UID: "a7d0f327-d854-4e68-ad2c-3e38949c4a34"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.363852 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d0f327-d854-4e68-ad2c-3e38949c4a34-kube-api-access-98tzm" (OuterVolumeSpecName: "kube-api-access-98tzm") pod "a7d0f327-d854-4e68-ad2c-3e38949c4a34" (UID: "a7d0f327-d854-4e68-ad2c-3e38949c4a34"). InnerVolumeSpecName "kube-api-access-98tzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.449629 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460756 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-config\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460800 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f78ld\" (UniqueName: \"kubernetes.io/projected/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-kube-api-access-f78ld\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460870 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-serving-cert\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460917 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-client-ca\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460967 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a7d0f327-d854-4e68-ad2c-3e38949c4a34-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460983 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.460994 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98tzm\" (UniqueName: \"kubernetes.io/projected/a7d0f327-d854-4e68-ad2c-3e38949c4a34-kube-api-access-98tzm\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.461007 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d0f327-d854-4e68-ad2c-3e38949c4a34-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.462098 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-config\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.462559 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-client-ca\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.466066 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-serving-cert\") pod 
\"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.479766 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f78ld\" (UniqueName: \"kubernetes.io/projected/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-kube-api-access-f78ld\") pod \"route-controller-manager-574484fcb5-lhc8s\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.561241 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-proxy-ca-bundles\") pod \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.561315 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4h98\" (UniqueName: \"kubernetes.io/projected/336fed2a-afcc-4624-bcb9-49c27c12f9f3-kube-api-access-j4h98\") pod \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.561452 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-config\") pod \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.561496 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336fed2a-afcc-4624-bcb9-49c27c12f9f3-serving-cert\") pod \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\" (UID: 
\"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.561547 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-client-ca\") pod \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\" (UID: \"336fed2a-afcc-4624-bcb9-49c27c12f9f3\") " Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.562611 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "336fed2a-afcc-4624-bcb9-49c27c12f9f3" (UID: "336fed2a-afcc-4624-bcb9-49c27c12f9f3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.562621 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-config" (OuterVolumeSpecName: "config") pod "336fed2a-afcc-4624-bcb9-49c27c12f9f3" (UID: "336fed2a-afcc-4624-bcb9-49c27c12f9f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.562658 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-client-ca" (OuterVolumeSpecName: "client-ca") pod "336fed2a-afcc-4624-bcb9-49c27c12f9f3" (UID: "336fed2a-afcc-4624-bcb9-49c27c12f9f3"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.566146 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336fed2a-afcc-4624-bcb9-49c27c12f9f3-kube-api-access-j4h98" (OuterVolumeSpecName: "kube-api-access-j4h98") pod "336fed2a-afcc-4624-bcb9-49c27c12f9f3" (UID: "336fed2a-afcc-4624-bcb9-49c27c12f9f3"). InnerVolumeSpecName "kube-api-access-j4h98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.566841 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336fed2a-afcc-4624-bcb9-49c27c12f9f3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "336fed2a-afcc-4624-bcb9-49c27c12f9f3" (UID: "336fed2a-afcc-4624-bcb9-49c27c12f9f3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.622996 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.635589 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q"] Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.642327 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8b474d54-h449q"] Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.663622 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.663666 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4h98\" (UniqueName: \"kubernetes.io/projected/336fed2a-afcc-4624-bcb9-49c27c12f9f3-kube-api-access-j4h98\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.663679 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.663693 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/336fed2a-afcc-4624-bcb9-49c27c12f9f3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.663704 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/336fed2a-afcc-4624-bcb9-49c27c12f9f3-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:35 crc kubenswrapper[4992]: I0131 09:30:35.890478 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s"] Jan 31 09:30:35 crc kubenswrapper[4992]: W0131 09:30:35.894404 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a92b6f_2fa6_4282_a3d7_1c1ee3cd4511.slice/crio-d498c88a48b805ee033e06a9c99b4e3a8e37e42b88609da07a2b08eaeefc6bcc WatchSource:0}: Error finding container d498c88a48b805ee033e06a9c99b4e3a8e37e42b88609da07a2b08eaeefc6bcc: Status 404 returned error can't find the container with id d498c88a48b805ee033e06a9c99b4e3a8e37e42b88609da07a2b08eaeefc6bcc Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.088406 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.292484 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.300879 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" event={"ID":"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511","Type":"ContainerStarted","Data":"9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0"} Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.300925 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" event={"ID":"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511","Type":"ContainerStarted","Data":"d498c88a48b805ee033e06a9c99b4e3a8e37e42b88609da07a2b08eaeefc6bcc"} Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.300945 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.303450 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" event={"ID":"336fed2a-afcc-4624-bcb9-49c27c12f9f3","Type":"ContainerDied","Data":"14aa488a724d7a5ec8c326ed68d2f8d9f7d0396a27066c54c0e3f8e9f8adb5f9"} Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.303500 4992 scope.go:117] "RemoveContainer" containerID="7a2971165886bf8c8533284f55c8e51054e5549d255620da7222788ce1d049da" Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.303588 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8" Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.337906 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" podStartSLOduration=2.337880865 podStartE2EDuration="2.337880865s" podCreationTimestamp="2026-01-31 09:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:36.334639907 +0000 UTC m=+332.306031944" watchObservedRunningTime="2026-01-31 09:30:36.337880865 +0000 UTC m=+332.309272862" Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.352664 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8"] Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.357087 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-794fdf6f6b-4wtf8"] Jan 31 09:30:36 crc kubenswrapper[4992]: I0131 09:30:36.490011 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.189287 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="336fed2a-afcc-4624-bcb9-49c27c12f9f3" path="/var/lib/kubelet/pods/336fed2a-afcc-4624-bcb9-49c27c12f9f3/volumes" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.190549 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d0f327-d854-4e68-ad2c-3e38949c4a34" path="/var/lib/kubelet/pods/a7d0f327-d854-4e68-ad2c-3e38949c4a34/volumes" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.664283 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4"] Jan 31 09:30:37 crc kubenswrapper[4992]: E0131 09:30:37.664516 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336fed2a-afcc-4624-bcb9-49c27c12f9f3" containerName="controller-manager" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.664528 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="336fed2a-afcc-4624-bcb9-49c27c12f9f3" containerName="controller-manager" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.664650 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="336fed2a-afcc-4624-bcb9-49c27c12f9f3" containerName="controller-manager" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.665024 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.666633 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.666712 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.666976 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.667032 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.667110 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.668408 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.672982 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.674844 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4"] Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.693126 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-serving-cert\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " 
pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.693259 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-proxy-ca-bundles\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.693292 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzr7p\" (UniqueName: \"kubernetes.io/projected/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-kube-api-access-zzr7p\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.693322 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-config\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.693346 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-client-ca\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.794788 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-client-ca\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.794858 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-serving-cert\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.794916 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-proxy-ca-bundles\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.794938 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzr7p\" (UniqueName: \"kubernetes.io/projected/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-kube-api-access-zzr7p\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.794959 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-config\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.796057 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-client-ca\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.796266 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-config\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.797755 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-proxy-ca-bundles\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.801479 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-serving-cert\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.815954 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzr7p\" (UniqueName: \"kubernetes.io/projected/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-kube-api-access-zzr7p\") pod \"controller-manager-6dd6bcf659-zmwh4\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 
09:30:37 crc kubenswrapper[4992]: I0131 09:30:37.980012 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:38 crc kubenswrapper[4992]: I0131 09:30:38.265680 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4"] Jan 31 09:30:38 crc kubenswrapper[4992]: W0131 09:30:38.280619 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec52ce03_f6d0_4484_ac27_c8f614f3d7f0.slice/crio-4d4307176bd850883a9f015937cf09735ffa959a4354edbb74fe5eb6e59a0b09 WatchSource:0}: Error finding container 4d4307176bd850883a9f015937cf09735ffa959a4354edbb74fe5eb6e59a0b09: Status 404 returned error can't find the container with id 4d4307176bd850883a9f015937cf09735ffa959a4354edbb74fe5eb6e59a0b09 Jan 31 09:30:38 crc kubenswrapper[4992]: I0131 09:30:38.280750 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 09:30:38 crc kubenswrapper[4992]: I0131 09:30:38.315403 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" event={"ID":"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0","Type":"ContainerStarted","Data":"4d4307176bd850883a9f015937cf09735ffa959a4354edbb74fe5eb6e59a0b09"} Jan 31 09:30:39 crc kubenswrapper[4992]: I0131 09:30:39.324083 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" event={"ID":"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0","Type":"ContainerStarted","Data":"035d8dedf3a594b229d8883b5b61a214246e34e69afbc2ac8dd0bdf4a031043e"} Jan 31 09:30:39 crc kubenswrapper[4992]: I0131 09:30:39.324583 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:39 crc 
kubenswrapper[4992]: I0131 09:30:39.334656 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:39 crc kubenswrapper[4992]: I0131 09:30:39.349438 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" podStartSLOduration=5.34939525 podStartE2EDuration="5.34939525s" podCreationTimestamp="2026-01-31 09:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:39.347135852 +0000 UTC m=+335.318527879" watchObservedRunningTime="2026-01-31 09:30:39.34939525 +0000 UTC m=+335.320787257" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.205820 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4"] Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.206595 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" podUID="ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" containerName="controller-manager" containerID="cri-o://035d8dedf3a594b229d8883b5b61a214246e34e69afbc2ac8dd0bdf4a031043e" gracePeriod=30 Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.306679 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s"] Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.307270 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" podUID="a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" containerName="route-controller-manager" containerID="cri-o://9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0" gracePeriod=30 Jan 31 09:30:54 crc 
kubenswrapper[4992]: I0131 09:30:54.414457 4992 generic.go:334] "Generic (PLEG): container finished" podID="ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" containerID="035d8dedf3a594b229d8883b5b61a214246e34e69afbc2ac8dd0bdf4a031043e" exitCode=0 Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.414505 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" event={"ID":"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0","Type":"ContainerDied","Data":"035d8dedf3a594b229d8883b5b61a214246e34e69afbc2ac8dd0bdf4a031043e"} Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.796874 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.801264 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818466 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzr7p\" (UniqueName: \"kubernetes.io/projected/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-kube-api-access-zzr7p\") pod \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818506 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-client-ca\") pod \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818531 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-serving-cert\") pod 
\"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818567 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-config\") pod \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818616 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-client-ca\") pod \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818642 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-config\") pod \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818677 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-serving-cert\") pod \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818695 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-proxy-ca-bundles\") pod \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\" (UID: \"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.818715 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f78ld\" 
(UniqueName: \"kubernetes.io/projected/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-kube-api-access-f78ld\") pod \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\" (UID: \"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511\") " Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.819368 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" (UID: "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.819397 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-config" (OuterVolumeSpecName: "config") pod "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" (UID: "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.819494 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" (UID: "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.819741 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" (UID: "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.819823 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-config" (OuterVolumeSpecName: "config") pod "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" (UID: "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.825249 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" (UID: "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.825643 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-kube-api-access-zzr7p" (OuterVolumeSpecName: "kube-api-access-zzr7p") pod "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" (UID: "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0"). InnerVolumeSpecName "kube-api-access-zzr7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.826933 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-kube-api-access-f78ld" (OuterVolumeSpecName: "kube-api-access-f78ld") pod "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" (UID: "a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511"). InnerVolumeSpecName "kube-api-access-f78ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.827539 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" (UID: "ec52ce03-f6d0-4484-ac27-c8f614f3d7f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.919559 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f78ld\" (UniqueName: \"kubernetes.io/projected/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-kube-api-access-f78ld\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920057 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzr7p\" (UniqueName: \"kubernetes.io/projected/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-kube-api-access-zzr7p\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920077 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920087 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920096 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920104 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920112 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920119 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:54 crc kubenswrapper[4992]: I0131 09:30:54.920127 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.423209 4992 generic.go:334] "Generic (PLEG): container finished" podID="a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" containerID="9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0" exitCode=0 Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.423264 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" event={"ID":"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511","Type":"ContainerDied","Data":"9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0"} Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.423289 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" event={"ID":"a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511","Type":"ContainerDied","Data":"d498c88a48b805ee033e06a9c99b4e3a8e37e42b88609da07a2b08eaeefc6bcc"} Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.423305 4992 scope.go:117] "RemoveContainer" 
containerID="9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.423405 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.427187 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" event={"ID":"ec52ce03-f6d0-4484-ac27-c8f614f3d7f0","Type":"ContainerDied","Data":"4d4307176bd850883a9f015937cf09735ffa959a4354edbb74fe5eb6e59a0b09"} Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.428238 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.450868 4992 scope.go:117] "RemoveContainer" containerID="9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.450918 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s"] Jan 31 09:30:55 crc kubenswrapper[4992]: E0131 09:30:55.451679 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0\": container with ID starting with 9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0 not found: ID does not exist" containerID="9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.451845 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0"} err="failed to get container status 
\"9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0\": rpc error: code = NotFound desc = could not find container \"9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0\": container with ID starting with 9feea63338a9285c633d56310d482efa082d826c8957b1a0482c4bfd951a2cf0 not found: ID does not exist" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.451874 4992 scope.go:117] "RemoveContainer" containerID="035d8dedf3a594b229d8883b5b61a214246e34e69afbc2ac8dd0bdf4a031043e" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.458595 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-lhc8s"] Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.462989 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4"] Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.467132 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-zmwh4"] Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.681300 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b69b979b6-sqh5d"] Jan 31 09:30:55 crc kubenswrapper[4992]: E0131 09:30:55.681952 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" containerName="controller-manager" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.681971 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" containerName="controller-manager" Jan 31 09:30:55 crc kubenswrapper[4992]: E0131 09:30:55.681985 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" containerName="route-controller-manager" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.681994 4992 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" containerName="route-controller-manager" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.682158 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" containerName="controller-manager" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.682201 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" containerName="route-controller-manager" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.683074 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.687227 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.687755 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.689022 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.689158 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.689249 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.689539 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.698216 4992 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.698398 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"] Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.699299 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.700552 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.704435 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.705657 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.706059 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.706682 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.707121 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.707372 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b69b979b6-sqh5d"] Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.709040 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"] Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.831708 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsfl6\" (UniqueName: \"kubernetes.io/projected/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-kube-api-access-zsfl6\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.831813 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-client-ca\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.831852 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-config\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.831897 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkss\" (UniqueName: \"kubernetes.io/projected/919d419e-3b0e-4133-87ec-4dca2cd10320-kube-api-access-bgkss\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.831944 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-proxy-ca-bundles\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.831986 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-serving-cert\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.832034 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-config\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.832081 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/919d419e-3b0e-4133-87ec-4dca2cd10320-serving-cert\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.832116 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-client-ca\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: 
\"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933594 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-serving-cert\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933646 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-config\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933675 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/919d419e-3b0e-4133-87ec-4dca2cd10320-serving-cert\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933693 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-client-ca\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933714 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsfl6\" 
(UniqueName: \"kubernetes.io/projected/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-kube-api-access-zsfl6\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933740 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-client-ca\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933757 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-config\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933781 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkss\" (UniqueName: \"kubernetes.io/projected/919d419e-3b0e-4133-87ec-4dca2cd10320-kube-api-access-bgkss\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.933802 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-proxy-ca-bundles\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: 
I0131 09:30:55.934837 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-client-ca\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.935113 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-config\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.935262 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-proxy-ca-bundles\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.935462 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-config\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.936345 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-client-ca\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " 
pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.937685 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-serving-cert\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.939803 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/919d419e-3b0e-4133-87ec-4dca2cd10320-serving-cert\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.949834 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsfl6\" (UniqueName: \"kubernetes.io/projected/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-kube-api-access-zsfl6\") pod \"route-controller-manager-b7c494b96-x7v4w\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") " pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:55 crc kubenswrapper[4992]: I0131 09:30:55.950044 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkss\" (UniqueName: \"kubernetes.io/projected/919d419e-3b0e-4133-87ec-4dca2cd10320-kube-api-access-bgkss\") pod \"controller-manager-b69b979b6-sqh5d\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") " pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.036800 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.044389 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.225720 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b69b979b6-sqh5d"] Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.440326 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" event={"ID":"919d419e-3b0e-4133-87ec-4dca2cd10320","Type":"ContainerStarted","Data":"3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85"} Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.440377 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" event={"ID":"919d419e-3b0e-4133-87ec-4dca2cd10320","Type":"ContainerStarted","Data":"a996948bb52aeee4241f739b19b607975ab5d800569aa5a5c6deb622e8f536be"} Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.440679 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.445431 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.457414 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" podStartSLOduration=2.457393645 podStartE2EDuration="2.457393645s" podCreationTimestamp="2026-01-31 09:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:56.456072515 +0000 UTC m=+352.427464522" watchObservedRunningTime="2026-01-31 09:30:56.457393645 +0000 UTC m=+352.428785632" Jan 31 09:30:56 crc kubenswrapper[4992]: I0131 09:30:56.494938 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"] Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.188798 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511" path="/var/lib/kubelet/pods/a8a92b6f-2fa6-4282-a3d7-1c1ee3cd4511/volumes" Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.189627 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec52ce03-f6d0-4484-ac27-c8f614f3d7f0" path="/var/lib/kubelet/pods/ec52ce03-f6d0-4484-ac27-c8f614f3d7f0/volumes" Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.445837 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" event={"ID":"de87c20e-fabd-4c43-a3d3-9b401a1db0e6","Type":"ContainerStarted","Data":"0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494"} Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.445885 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" event={"ID":"de87c20e-fabd-4c43-a3d3-9b401a1db0e6","Type":"ContainerStarted","Data":"a6da05283debcbb0cc4e7f363ecc86bf3afb3d0011037faed5cf116e31bb4394"} Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.446148 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.451731 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" Jan 31 09:30:57 crc kubenswrapper[4992]: I0131 09:30:57.460603 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" podStartSLOduration=3.460581177 podStartE2EDuration="3.460581177s" podCreationTimestamp="2026-01-31 09:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:30:57.459615488 +0000 UTC m=+353.431007495" watchObservedRunningTime="2026-01-31 09:30:57.460581177 +0000 UTC m=+353.431973164" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.336142 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8qcdm"] Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.337502 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.351916 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8qcdm"] Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353290 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-registry-certificates\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353356 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353484 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353521 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-trusted-ca\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353558 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353583 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jrr7\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-kube-api-access-9jrr7\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353618 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-bound-sa-token\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.353649 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-registry-tls\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.416313 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.454928 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.454996 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-trusted-ca\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.455033 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.455057 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jrr7\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-kube-api-access-9jrr7\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.455103 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-bound-sa-token\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.455136 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-registry-tls\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.455164 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-registry-certificates\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.455510 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.456132 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-trusted-ca\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.456481 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-registry-certificates\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.464992 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.465543 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-registry-tls\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.478809 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jrr7\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-kube-api-access-9jrr7\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.479084 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec66848a-526d-4c24-a2f8-cbbeb28d9bb3-bound-sa-token\") pod \"image-registry-66df7c8f76-8qcdm\" (UID: \"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3\") " pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:04 crc kubenswrapper[4992]: I0131 09:31:04.664075 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:05 crc kubenswrapper[4992]: I0131 09:31:05.079200 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8qcdm"] Jan 31 09:31:05 crc kubenswrapper[4992]: I0131 09:31:05.492498 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" event={"ID":"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3","Type":"ContainerStarted","Data":"f7a42643c6ca7eeeea945d392239d5dcc6f30563482809d1689d53bf42bdf2fa"} Jan 31 09:31:05 crc kubenswrapper[4992]: I0131 09:31:05.492848 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" event={"ID":"ec66848a-526d-4c24-a2f8-cbbeb28d9bb3","Type":"ContainerStarted","Data":"211535c10a0768c7510bd4c7820c6101b4f1db6781dfbd8bbe78ff22ca896da0"} Jan 31 09:31:05 crc kubenswrapper[4992]: I0131 09:31:05.492871 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:05 crc kubenswrapper[4992]: I0131 09:31:05.516022 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" podStartSLOduration=1.516007294 podStartE2EDuration="1.516007294s" podCreationTimestamp="2026-01-31 09:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:31:05.515188909 +0000 UTC m=+361.486580906" watchObservedRunningTime="2026-01-31 09:31:05.516007294 +0000 UTC m=+361.487399281" Jan 31 09:31:15 crc kubenswrapper[4992]: I0131 09:31:15.300763 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:31:15 crc kubenswrapper[4992]: I0131 09:31:15.301186 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:31:24 crc kubenswrapper[4992]: I0131 09:31:24.668993 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8qcdm" Jan 31 09:31:24 crc kubenswrapper[4992]: I0131 09:31:24.724409 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-j6dj7"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.059567 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4drj6"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.060477 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4drj6" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="registry-server" containerID="cri-o://94afe569ea782071a7e8325cb527ab1edeb13d943b10205b7ea648d164ca4b31" gracePeriod=30 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.066647 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7j8l"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.066909 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b7j8l" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="registry-server" containerID="cri-o://008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6" gracePeriod=30 Jan 31 09:31:31 crc 
kubenswrapper[4992]: I0131 09:31:31.085149 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r68rm"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.085758 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerName="marketplace-operator" containerID="cri-o://b768c51c7370bf0a5e2863e9d4fe57a03fab7b1bcfb19617106bd81eb457d9f1" gracePeriod=30 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.095210 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qtkp"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.095523 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2qtkp" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="registry-server" containerID="cri-o://22291ad67814e70eb10003c0326e63a44ad4978e481eb5ebba9dc6877389b215" gracePeriod=30 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.102572 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rg97"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.103707 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.105984 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xmbj"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.106193 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9xmbj" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="registry-server" containerID="cri-o://21d7c088f7dec369814d7bf5a663c8b9dc81e84a1c87207d791830d4bbf75e3b" gracePeriod=30 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.119187 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rg97"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.241114 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7fg4\" (UniqueName: \"kubernetes.io/projected/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-kube-api-access-z7fg4\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.241285 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.241478 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.351810 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.351887 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7fg4\" (UniqueName: \"kubernetes.io/projected/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-kube-api-access-z7fg4\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.351932 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.353029 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 
crc kubenswrapper[4992]: I0131 09:31:31.362468 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.387964 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7fg4\" (UniqueName: \"kubernetes.io/projected/854b03be-b8cf-4b7a-91bd-9a7c0c8342ca-kube-api-access-z7fg4\") pod \"marketplace-operator-79b997595-4rg97\" (UID: \"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.425594 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.609025 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.639110 4992 generic.go:334] "Generic (PLEG): container finished" podID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerID="22291ad67814e70eb10003c0326e63a44ad4978e481eb5ebba9dc6877389b215" exitCode=0 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.639359 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qtkp" event={"ID":"2cef084e-8345-4b18-ade1-4cc6e9fbfd09","Type":"ContainerDied","Data":"22291ad67814e70eb10003c0326e63a44ad4978e481eb5ebba9dc6877389b215"} Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.655257 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-utilities\") pod \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.655347 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb5wd\" (UniqueName: \"kubernetes.io/projected/533c10ab-faa7-4a62-8e8a-2ebd87578ced-kube-api-access-rb5wd\") pod \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.655373 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-catalog-content\") pod \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\" (UID: \"533c10ab-faa7-4a62-8e8a-2ebd87578ced\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.656462 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-utilities" (OuterVolumeSpecName: "utilities") pod 
"533c10ab-faa7-4a62-8e8a-2ebd87578ced" (UID: "533c10ab-faa7-4a62-8e8a-2ebd87578ced"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.659040 4992 generic.go:334] "Generic (PLEG): container finished" podID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerID="008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6" exitCode=0 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.659093 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7j8l" event={"ID":"533c10ab-faa7-4a62-8e8a-2ebd87578ced","Type":"ContainerDied","Data":"008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6"} Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.659120 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7j8l" event={"ID":"533c10ab-faa7-4a62-8e8a-2ebd87578ced","Type":"ContainerDied","Data":"6e9cb256a735d92bc735ce08030d7d9f34ebeb99e0532a0b78364f70d3846e60"} Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.659136 4992 scope.go:117] "RemoveContainer" containerID="008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.659228 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b7j8l" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.661978 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533c10ab-faa7-4a62-8e8a-2ebd87578ced-kube-api-access-rb5wd" (OuterVolumeSpecName: "kube-api-access-rb5wd") pod "533c10ab-faa7-4a62-8e8a-2ebd87578ced" (UID: "533c10ab-faa7-4a62-8e8a-2ebd87578ced"). InnerVolumeSpecName "kube-api-access-rb5wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.663690 4992 generic.go:334] "Generic (PLEG): container finished" podID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerID="b768c51c7370bf0a5e2863e9d4fe57a03fab7b1bcfb19617106bd81eb457d9f1" exitCode=0 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.663756 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" event={"ID":"42351b07-cf74-49fd-b6fd-88b7ef8fdac0","Type":"ContainerDied","Data":"b768c51c7370bf0a5e2863e9d4fe57a03fab7b1bcfb19617106bd81eb457d9f1"} Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.670582 4992 generic.go:334] "Generic (PLEG): container finished" podID="de317207-ef37-4247-9d1d-279570141ebc" containerID="21d7c088f7dec369814d7bf5a663c8b9dc81e84a1c87207d791830d4bbf75e3b" exitCode=0 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.670651 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xmbj" event={"ID":"de317207-ef37-4247-9d1d-279570141ebc","Type":"ContainerDied","Data":"21d7c088f7dec369814d7bf5a663c8b9dc81e84a1c87207d791830d4bbf75e3b"} Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.678226 4992 generic.go:334] "Generic (PLEG): container finished" podID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerID="94afe569ea782071a7e8325cb527ab1edeb13d943b10205b7ea648d164ca4b31" exitCode=0 Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.678293 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drj6" event={"ID":"7c4d6b90-976c-46f4-b55f-26d3277cc754","Type":"ContainerDied","Data":"94afe569ea782071a7e8325cb527ab1edeb13d943b10205b7ea648d164ca4b31"} Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.694366 4992 scope.go:117] "RemoveContainer" containerID="9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e" Jan 31 09:31:31 crc 
kubenswrapper[4992]: I0131 09:31:31.755080 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "533c10ab-faa7-4a62-8e8a-2ebd87578ced" (UID: "533c10ab-faa7-4a62-8e8a-2ebd87578ced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.756346 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.756373 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb5wd\" (UniqueName: \"kubernetes.io/projected/533c10ab-faa7-4a62-8e8a-2ebd87578ced-kube-api-access-rb5wd\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.756386 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c10ab-faa7-4a62-8e8a-2ebd87578ced-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.784800 4992 scope.go:117] "RemoveContainer" containerID="ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.786374 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4rg97"] Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.795724 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.801296 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.805743 4992 scope.go:117] "RemoveContainer" containerID="008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6" Jan 31 09:31:31 crc kubenswrapper[4992]: E0131 09:31:31.806822 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6\": container with ID starting with 008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6 not found: ID does not exist" containerID="008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.806854 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6"} err="failed to get container status \"008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6\": rpc error: code = NotFound desc = could not find container \"008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6\": container with ID starting with 008c7058149d437aceeb68d97a9b39a578ca8d93a8fc56a613e00c082b93d7e6 not found: ID does not exist" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.806876 4992 scope.go:117] "RemoveContainer" containerID="9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e" Jan 31 09:31:31 crc kubenswrapper[4992]: E0131 09:31:31.807546 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e\": container with ID starting with 9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e not found: ID does not exist" containerID="9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e" Jan 31 09:31:31 crc 
kubenswrapper[4992]: I0131 09:31:31.807585 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e"} err="failed to get container status \"9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e\": rpc error: code = NotFound desc = could not find container \"9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e\": container with ID starting with 9a56e528264cae4e293fabbeed8e29d920b65aa95ad6b52b992496e9f9d0be7e not found: ID does not exist" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.807608 4992 scope.go:117] "RemoveContainer" containerID="ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4" Jan 31 09:31:31 crc kubenswrapper[4992]: E0131 09:31:31.808057 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4\": container with ID starting with ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4 not found: ID does not exist" containerID="ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.808086 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4"} err="failed to get container status \"ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4\": rpc error: code = NotFound desc = could not find container \"ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4\": container with ID starting with ced9047484f4014ac1a1d057112a2d14f2b2edb8a49597b312bfbe5817dd32b4 not found: ID does not exist" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.809366 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.810699 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857383 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-catalog-content\") pod \"de317207-ef37-4247-9d1d-279570141ebc\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857481 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-catalog-content\") pod \"7c4d6b90-976c-46f4-b55f-26d3277cc754\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857550 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpf6w\" (UniqueName: \"kubernetes.io/projected/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-kube-api-access-qpf6w\") pod \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857574 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-utilities\") pod \"7c4d6b90-976c-46f4-b55f-26d3277cc754\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857617 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-operator-metrics\") pod 
\"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857635 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np6ql\" (UniqueName: \"kubernetes.io/projected/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-kube-api-access-np6ql\") pod \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857658 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9nwc\" (UniqueName: \"kubernetes.io/projected/7c4d6b90-976c-46f4-b55f-26d3277cc754-kube-api-access-k9nwc\") pod \"7c4d6b90-976c-46f4-b55f-26d3277cc754\" (UID: \"7c4d6b90-976c-46f4-b55f-26d3277cc754\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857804 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-trusted-ca\") pod \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\" (UID: \"42351b07-cf74-49fd-b6fd-88b7ef8fdac0\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857824 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-utilities\") pod \"de317207-ef37-4247-9d1d-279570141ebc\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857844 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-utilities\") pod \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857881 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fdpxs\" (UniqueName: \"kubernetes.io/projected/de317207-ef37-4247-9d1d-279570141ebc-kube-api-access-fdpxs\") pod \"de317207-ef37-4247-9d1d-279570141ebc\" (UID: \"de317207-ef37-4247-9d1d-279570141ebc\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.857914 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-catalog-content\") pod \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\" (UID: \"2cef084e-8345-4b18-ade1-4cc6e9fbfd09\") " Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.858550 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-utilities" (OuterVolumeSpecName: "utilities") pod "7c4d6b90-976c-46f4-b55f-26d3277cc754" (UID: "7c4d6b90-976c-46f4-b55f-26d3277cc754"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.861519 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "42351b07-cf74-49fd-b6fd-88b7ef8fdac0" (UID: "42351b07-cf74-49fd-b6fd-88b7ef8fdac0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.862678 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-utilities" (OuterVolumeSpecName: "utilities") pod "2cef084e-8345-4b18-ade1-4cc6e9fbfd09" (UID: "2cef084e-8345-4b18-ade1-4cc6e9fbfd09"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.865975 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de317207-ef37-4247-9d1d-279570141ebc-kube-api-access-fdpxs" (OuterVolumeSpecName: "kube-api-access-fdpxs") pod "de317207-ef37-4247-9d1d-279570141ebc" (UID: "de317207-ef37-4247-9d1d-279570141ebc"). InnerVolumeSpecName "kube-api-access-fdpxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.867037 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "42351b07-cf74-49fd-b6fd-88b7ef8fdac0" (UID: "42351b07-cf74-49fd-b6fd-88b7ef8fdac0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.867626 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-kube-api-access-qpf6w" (OuterVolumeSpecName: "kube-api-access-qpf6w") pod "42351b07-cf74-49fd-b6fd-88b7ef8fdac0" (UID: "42351b07-cf74-49fd-b6fd-88b7ef8fdac0"). InnerVolumeSpecName "kube-api-access-qpf6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.870411 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-utilities" (OuterVolumeSpecName: "utilities") pod "de317207-ef37-4247-9d1d-279570141ebc" (UID: "de317207-ef37-4247-9d1d-279570141ebc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.870787 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4d6b90-976c-46f4-b55f-26d3277cc754-kube-api-access-k9nwc" (OuterVolumeSpecName: "kube-api-access-k9nwc") pod "7c4d6b90-976c-46f4-b55f-26d3277cc754" (UID: "7c4d6b90-976c-46f4-b55f-26d3277cc754"). InnerVolumeSpecName "kube-api-access-k9nwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.874259 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-kube-api-access-np6ql" (OuterVolumeSpecName: "kube-api-access-np6ql") pod "2cef084e-8345-4b18-ade1-4cc6e9fbfd09" (UID: "2cef084e-8345-4b18-ade1-4cc6e9fbfd09"). InnerVolumeSpecName "kube-api-access-np6ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.912724 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cef084e-8345-4b18-ade1-4cc6e9fbfd09" (UID: "2cef084e-8345-4b18-ade1-4cc6e9fbfd09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.921584 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c4d6b90-976c-46f4-b55f-26d3277cc754" (UID: "7c4d6b90-976c-46f4-b55f-26d3277cc754"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.959891 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdpxs\" (UniqueName: \"kubernetes.io/projected/de317207-ef37-4247-9d1d-279570141ebc-kube-api-access-fdpxs\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.959922 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.959932 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.959940 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpf6w\" (UniqueName: \"kubernetes.io/projected/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-kube-api-access-qpf6w\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.959948 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c4d6b90-976c-46f4-b55f-26d3277cc754-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.959957 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np6ql\" (UniqueName: \"kubernetes.io/projected/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-kube-api-access-np6ql\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.960001 4992 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 
09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.960015 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9nwc\" (UniqueName: \"kubernetes.io/projected/7c4d6b90-976c-46f4-b55f-26d3277cc754-kube-api-access-k9nwc\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.960027 4992 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42351b07-cf74-49fd-b6fd-88b7ef8fdac0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.960037 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.960047 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cef084e-8345-4b18-ade1-4cc6e9fbfd09-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:31 crc kubenswrapper[4992]: I0131 09:31:31.997531 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7j8l"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.001145 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b7j8l"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.034180 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de317207-ef37-4247-9d1d-279570141ebc" (UID: "de317207-ef37-4247-9d1d-279570141ebc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.061646 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de317207-ef37-4247-9d1d-279570141ebc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.685706 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xmbj" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.685701 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xmbj" event={"ID":"de317207-ef37-4247-9d1d-279570141ebc","Type":"ContainerDied","Data":"6c1967a90e05dc91b15c267515def712dc0b9d38117d8641165a7e0c2c01e9eb"} Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.685831 4992 scope.go:117] "RemoveContainer" containerID="21d7c088f7dec369814d7bf5a663c8b9dc81e84a1c87207d791830d4bbf75e3b" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.689572 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4drj6" event={"ID":"7c4d6b90-976c-46f4-b55f-26d3277cc754","Type":"ContainerDied","Data":"9c80b324340349f5b257c78500fc07a5541884ace517d1a85b26be695b5aabdb"} Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.689627 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4drj6" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.692320 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2qtkp" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.692578 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2qtkp" event={"ID":"2cef084e-8345-4b18-ade1-4cc6e9fbfd09","Type":"ContainerDied","Data":"344c33b97e18935f1cbf913477a3d0add281956c5629bb4bf4ab4ee8b565822f"} Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.694580 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" event={"ID":"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca","Type":"ContainerStarted","Data":"41329fcd49e1f83211c1de5e46927b7624210d40938f9634a3ccdc5d8ddb5f7a"} Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.694610 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" event={"ID":"854b03be-b8cf-4b7a-91bd-9a7c0c8342ca","Type":"ContainerStarted","Data":"333fdb95db236dfa7bf05286324fee26c5b2bfddca0434eb451e6b9e72f78934"} Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.694821 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.702770 4992 scope.go:117] "RemoveContainer" containerID="24d6f56d5d7e13f415e7cf4b2f7dd19d69d56877714e1c5f80840cdef75dd50d" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.705958 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.707947 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" event={"ID":"42351b07-cf74-49fd-b6fd-88b7ef8fdac0","Type":"ContainerDied","Data":"7ab41c0f05464b651a1c86b5589e7401ff8a843a151d37e96a601a5b6a14a2e4"} Jan 31 09:31:32 crc 
kubenswrapper[4992]: I0131 09:31:32.708100 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-r68rm" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.723960 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4rg97" podStartSLOduration=1.723940755 podStartE2EDuration="1.723940755s" podCreationTimestamp="2026-01-31 09:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:31:32.719718538 +0000 UTC m=+388.691110545" watchObservedRunningTime="2026-01-31 09:31:32.723940755 +0000 UTC m=+388.695332742" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.737214 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xmbj"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.746763 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9xmbj"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.762912 4992 scope.go:117] "RemoveContainer" containerID="8efb2be9790413a51a2acd66b07b3b93c1f5d281f451f9a0a9a256e6d5cbce3e" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.779623 4992 scope.go:117] "RemoveContainer" containerID="94afe569ea782071a7e8325cb527ab1edeb13d943b10205b7ea648d164ca4b31" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.783472 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4drj6"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.789132 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4drj6"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.801882 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qtkp"] Jan 
31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.807208 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2qtkp"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.814471 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r68rm"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.815889 4992 scope.go:117] "RemoveContainer" containerID="81713a8db7fcfe094a259706cd6ae6883b242b7e587950ba9bffdbf43c3652c1" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.817978 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-r68rm"] Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.836960 4992 scope.go:117] "RemoveContainer" containerID="857fe9d991ab2b833c2200c5b86fc8fa696ba81321cc86eda3e69d43f43e688b" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.854737 4992 scope.go:117] "RemoveContainer" containerID="22291ad67814e70eb10003c0326e63a44ad4978e481eb5ebba9dc6877389b215" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.866005 4992 scope.go:117] "RemoveContainer" containerID="4ce74399488f1e5ead804bf88ad0d77ab22df4d92da4f1e6c3adcdd63a632172" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.877978 4992 scope.go:117] "RemoveContainer" containerID="3794f68ad026ec339d98f3b94995a7847285f64c04bc04d5b9b1b3e3c2cbae91" Jan 31 09:31:32 crc kubenswrapper[4992]: I0131 09:31:32.891114 4992 scope.go:117] "RemoveContainer" containerID="b768c51c7370bf0a5e2863e9d4fe57a03fab7b1bcfb19617106bd81eb457d9f1" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.193378 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" path="/var/lib/kubelet/pods/2cef084e-8345-4b18-ade1-4cc6e9fbfd09/volumes" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.194472 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" path="/var/lib/kubelet/pods/42351b07-cf74-49fd-b6fd-88b7ef8fdac0/volumes" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.195014 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" path="/var/lib/kubelet/pods/533c10ab-faa7-4a62-8e8a-2ebd87578ced/volumes" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.196216 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" path="/var/lib/kubelet/pods/7c4d6b90-976c-46f4-b55f-26d3277cc754/volumes" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.196910 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de317207-ef37-4247-9d1d-279570141ebc" path="/var/lib/kubelet/pods/de317207-ef37-4247-9d1d-279570141ebc/volumes" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271243 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8zs9s"] Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271485 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="extract-content" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271499 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="extract-content" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271512 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="extract-content" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271519 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="extract-content" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271526 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="extract-content" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271532 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="extract-content" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271542 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="registry-server" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271548 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="registry-server" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271555 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="registry-server" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271562 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="registry-server" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271573 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="registry-server" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271580 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="registry-server" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271588 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerName="marketplace-operator" Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271594 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerName="marketplace-operator" Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271601 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271607 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271614 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271620 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271628 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271634 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271643 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="extract-content"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271649 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="extract-content"
Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271655 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="registry-server"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271661 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="registry-server"
Jan 31 09:31:33 crc kubenswrapper[4992]: E0131 09:31:33.271672 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271678 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="extract-utilities"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271754 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="533c10ab-faa7-4a62-8e8a-2ebd87578ced" containerName="registry-server"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271764 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cef084e-8345-4b18-ade1-4cc6e9fbfd09" containerName="registry-server"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271770 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="42351b07-cf74-49fd-b6fd-88b7ef8fdac0" containerName="marketplace-operator"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271779 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="de317207-ef37-4247-9d1d-279570141ebc" containerName="registry-server"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.271793 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c4d6b90-976c-46f4-b55f-26d3277cc754" containerName="registry-server"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.272572 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.275140 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.285701 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zs9s"]
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.376834 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr954\" (UniqueName: \"kubernetes.io/projected/265a14af-f30c-46a1-9618-e3b1e406f841-kube-api-access-dr954\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.377111 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a14af-f30c-46a1-9618-e3b1e406f841-utilities\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.377238 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a14af-f30c-46a1-9618-e3b1e406f841-catalog-content\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.471156 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fgmqt"]
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.472377 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.474878 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.478293 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr954\" (UniqueName: \"kubernetes.io/projected/265a14af-f30c-46a1-9618-e3b1e406f841-kube-api-access-dr954\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.478549 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a14af-f30c-46a1-9618-e3b1e406f841-utilities\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.478673 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a14af-f30c-46a1-9618-e3b1e406f841-catalog-content\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.479384 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a14af-f30c-46a1-9618-e3b1e406f841-utilities\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.479586 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a14af-f30c-46a1-9618-e3b1e406f841-catalog-content\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.484454 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgmqt"]
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.512331 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr954\" (UniqueName: \"kubernetes.io/projected/265a14af-f30c-46a1-9618-e3b1e406f841-kube-api-access-dr954\") pod \"redhat-marketplace-8zs9s\" (UID: \"265a14af-f30c-46a1-9618-e3b1e406f841\") " pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.579257 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fr4\" (UniqueName: \"kubernetes.io/projected/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-kube-api-access-27fr4\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.579381 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-catalog-content\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.579443 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-utilities\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.594573 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zs9s"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.680362 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-catalog-content\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.680502 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-utilities\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.680536 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fr4\" (UniqueName: \"kubernetes.io/projected/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-kube-api-access-27fr4\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.681045 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-catalog-content\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.681243 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-utilities\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.704016 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fr4\" (UniqueName: \"kubernetes.io/projected/ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977-kube-api-access-27fr4\") pod \"community-operators-fgmqt\" (UID: \"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977\") " pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:33 crc kubenswrapper[4992]: I0131 09:31:33.791156 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgmqt"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.003314 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zs9s"]
Jan 31 09:31:34 crc kubenswrapper[4992]: W0131 09:31:34.005315 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod265a14af_f30c_46a1_9618_e3b1e406f841.slice/crio-a7ff25d91fdef07b5b811f631b0276aaeee8669b109d69fab4f81b62badd932c WatchSource:0}: Error finding container a7ff25d91fdef07b5b811f631b0276aaeee8669b109d69fab4f81b62badd932c: Status 404 returned error can't find the container with id a7ff25d91fdef07b5b811f631b0276aaeee8669b109d69fab4f81b62badd932c
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.165211 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgmqt"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.196911 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b69b979b6-sqh5d"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.197113 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" podUID="919d419e-3b0e-4133-87ec-4dca2cd10320" containerName="controller-manager" containerID="cri-o://3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85" gracePeriod=30
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.216147 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.216347 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" podUID="de87c20e-fabd-4c43-a3d3-9b401a1db0e6" containerName="route-controller-manager" containerID="cri-o://0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494" gracePeriod=30
Jan 31 09:31:34 crc kubenswrapper[4992]: W0131 09:31:34.222009 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecb8ad5d_71b3_4c8f_adc9_c0b3bf227977.slice/crio-5a8b0d873bfac03d6088106228e2b4846dad97c9ed90c4f19f8b540680ca07ea WatchSource:0}: Error finding container 5a8b0d873bfac03d6088106228e2b4846dad97c9ed90c4f19f8b540680ca07ea: Status 404 returned error can't find the container with id 5a8b0d873bfac03d6088106228e2b4846dad97c9ed90c4f19f8b540680ca07ea
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.614199 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.620934 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702656 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-serving-cert\") pod \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702703 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-client-ca\") pod \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702728 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-config\") pod \"919d419e-3b0e-4133-87ec-4dca2cd10320\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702748 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-proxy-ca-bundles\") pod \"919d419e-3b0e-4133-87ec-4dca2cd10320\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702765 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/919d419e-3b0e-4133-87ec-4dca2cd10320-serving-cert\") pod \"919d419e-3b0e-4133-87ec-4dca2cd10320\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702798 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-client-ca\") pod \"919d419e-3b0e-4133-87ec-4dca2cd10320\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702821 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-config\") pod \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702843 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgkss\" (UniqueName: \"kubernetes.io/projected/919d419e-3b0e-4133-87ec-4dca2cd10320-kube-api-access-bgkss\") pod \"919d419e-3b0e-4133-87ec-4dca2cd10320\" (UID: \"919d419e-3b0e-4133-87ec-4dca2cd10320\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.702863 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsfl6\" (UniqueName: \"kubernetes.io/projected/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-kube-api-access-zsfl6\") pod \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\" (UID: \"de87c20e-fabd-4c43-a3d3-9b401a1db0e6\") "
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.704626 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-client-ca" (OuterVolumeSpecName: "client-ca") pod "919d419e-3b0e-4133-87ec-4dca2cd10320" (UID: "919d419e-3b0e-4133-87ec-4dca2cd10320"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.704639 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "919d419e-3b0e-4133-87ec-4dca2cd10320" (UID: "919d419e-3b0e-4133-87ec-4dca2cd10320"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.705130 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-config" (OuterVolumeSpecName: "config") pod "919d419e-3b0e-4133-87ec-4dca2cd10320" (UID: "919d419e-3b0e-4133-87ec-4dca2cd10320"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.705158 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-config" (OuterVolumeSpecName: "config") pod "de87c20e-fabd-4c43-a3d3-9b401a1db0e6" (UID: "de87c20e-fabd-4c43-a3d3-9b401a1db0e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.705542 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "de87c20e-fabd-4c43-a3d3-9b401a1db0e6" (UID: "de87c20e-fabd-4c43-a3d3-9b401a1db0e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.708401 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "de87c20e-fabd-4c43-a3d3-9b401a1db0e6" (UID: "de87c20e-fabd-4c43-a3d3-9b401a1db0e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.708406 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-kube-api-access-zsfl6" (OuterVolumeSpecName: "kube-api-access-zsfl6") pod "de87c20e-fabd-4c43-a3d3-9b401a1db0e6" (UID: "de87c20e-fabd-4c43-a3d3-9b401a1db0e6"). InnerVolumeSpecName "kube-api-access-zsfl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.708572 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919d419e-3b0e-4133-87ec-4dca2cd10320-kube-api-access-bgkss" (OuterVolumeSpecName: "kube-api-access-bgkss") pod "919d419e-3b0e-4133-87ec-4dca2cd10320" (UID: "919d419e-3b0e-4133-87ec-4dca2cd10320"). InnerVolumeSpecName "kube-api-access-bgkss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.708665 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919d419e-3b0e-4133-87ec-4dca2cd10320-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "919d419e-3b0e-4133-87ec-4dca2cd10320" (UID: "919d419e-3b0e-4133-87ec-4dca2cd10320"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.723566 4992 generic.go:334] "Generic (PLEG): container finished" podID="919d419e-3b0e-4133-87ec-4dca2cd10320" containerID="3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85" exitCode=0
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.723624 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.723967 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" event={"ID":"919d419e-3b0e-4133-87ec-4dca2cd10320","Type":"ContainerDied","Data":"3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.724139 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b69b979b6-sqh5d" event={"ID":"919d419e-3b0e-4133-87ec-4dca2cd10320","Type":"ContainerDied","Data":"a996948bb52aeee4241f739b19b607975ab5d800569aa5a5c6deb622e8f536be"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.724195 4992 scope.go:117] "RemoveContainer" containerID="3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.727599 4992 generic.go:334] "Generic (PLEG): container finished" podID="265a14af-f30c-46a1-9618-e3b1e406f841" containerID="e6b41ec6c3149c2f0df2e4a74db45a01c7729b9b16fa6cd0e50749e9aea0d338" exitCode=0
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.727679 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zs9s" event={"ID":"265a14af-f30c-46a1-9618-e3b1e406f841","Type":"ContainerDied","Data":"e6b41ec6c3149c2f0df2e4a74db45a01c7729b9b16fa6cd0e50749e9aea0d338"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.727717 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zs9s" event={"ID":"265a14af-f30c-46a1-9618-e3b1e406f841","Type":"ContainerStarted","Data":"a7ff25d91fdef07b5b811f631b0276aaeee8669b109d69fab4f81b62badd932c"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.730147 4992 generic.go:334] "Generic (PLEG): container finished" podID="ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977" containerID="233b8f3587a583e07d7066f5e912664f33dd27f9ed814a3d58ac7089addad4a1" exitCode=0
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.730190 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgmqt" event={"ID":"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977","Type":"ContainerDied","Data":"233b8f3587a583e07d7066f5e912664f33dd27f9ed814a3d58ac7089addad4a1"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.730358 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgmqt" event={"ID":"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977","Type":"ContainerStarted","Data":"5a8b0d873bfac03d6088106228e2b4846dad97c9ed90c4f19f8b540680ca07ea"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.738813 4992 generic.go:334] "Generic (PLEG): container finished" podID="de87c20e-fabd-4c43-a3d3-9b401a1db0e6" containerID="0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494" exitCode=0
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.738863 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.738892 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" event={"ID":"de87c20e-fabd-4c43-a3d3-9b401a1db0e6","Type":"ContainerDied","Data":"0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.738922 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w" event={"ID":"de87c20e-fabd-4c43-a3d3-9b401a1db0e6","Type":"ContainerDied","Data":"a6da05283debcbb0cc4e7f363ecc86bf3afb3d0011037faed5cf116e31bb4394"}
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.745258 4992 scope.go:117] "RemoveContainer" containerID="3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85"
Jan 31 09:31:34 crc kubenswrapper[4992]: E0131 09:31:34.745999 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85\": container with ID starting with 3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85 not found: ID does not exist" containerID="3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.746198 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85"} err="failed to get container status \"3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85\": rpc error: code = NotFound desc = could not find container \"3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85\": container with ID starting with 3cc27a170995f3dc8f1bddee03000e80892b6130660a0d686667b86c0e998b85 not found: ID does not exist"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.746494 4992 scope.go:117] "RemoveContainer" containerID="0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.769579 4992 scope.go:117] "RemoveContainer" containerID="0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494"
Jan 31 09:31:34 crc kubenswrapper[4992]: E0131 09:31:34.770044 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494\": container with ID starting with 0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494 not found: ID does not exist" containerID="0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.770095 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494"} err="failed to get container status \"0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494\": rpc error: code = NotFound desc = could not find container \"0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494\": container with ID starting with 0500de181d443cd42892d83c22098f8bf8dd9647940956518acbcab628d4d494 not found: ID does not exist"
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.773613 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b69b979b6-sqh5d"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.783590 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b69b979b6-sqh5d"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.796391 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.799387 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b7c494b96-x7v4w"]
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804045 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-config\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804217 4992 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804328 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/919d419e-3b0e-4133-87ec-4dca2cd10320-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804444 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/919d419e-3b0e-4133-87ec-4dca2cd10320-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804556 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-config\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804666 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgkss\" (UniqueName: \"kubernetes.io/projected/919d419e-3b0e-4133-87ec-4dca2cd10320-kube-api-access-bgkss\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804768 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsfl6\" (UniqueName: \"kubernetes.io/projected/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-kube-api-access-zsfl6\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.804953 4992 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:34 crc kubenswrapper[4992]: I0131 09:31:34.805054 4992 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de87c20e-fabd-4c43-a3d3-9b401a1db0e6-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.193809 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919d419e-3b0e-4133-87ec-4dca2cd10320" path="/var/lib/kubelet/pods/919d419e-3b0e-4133-87ec-4dca2cd10320/volumes"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.194527 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de87c20e-fabd-4c43-a3d3-9b401a1db0e6" path="/var/lib/kubelet/pods/de87c20e-fabd-4c43-a3d3-9b401a1db0e6/volumes"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.669726 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7k4c"]
Jan 31 09:31:35 crc kubenswrapper[4992]: E0131 09:31:35.670267 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919d419e-3b0e-4133-87ec-4dca2cd10320" containerName="controller-manager"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.670282 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="919d419e-3b0e-4133-87ec-4dca2cd10320" containerName="controller-manager"
Jan 31 09:31:35 crc kubenswrapper[4992]: E0131 09:31:35.670301 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de87c20e-fabd-4c43-a3d3-9b401a1db0e6" containerName="route-controller-manager"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.670309 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="de87c20e-fabd-4c43-a3d3-9b401a1db0e6" containerName="route-controller-manager"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.670436 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="919d419e-3b0e-4133-87ec-4dca2cd10320" containerName="controller-manager"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.670455 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="de87c20e-fabd-4c43-a3d3-9b401a1db0e6" containerName="route-controller-manager"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.671162 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7k4c"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.673021 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.680763 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7k4c"]
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.746312 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd"]
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.747564 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.749509 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq"]
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.749995 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.750103 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.752961 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.753022 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.753160 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.753513 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.753685 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.754318 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.755238 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.755349 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.755463 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.755569 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.757997 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.761109 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.769453 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd"]
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.773081 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq"]
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.824361 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-catalog-content\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c"
Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.824626 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\"
(UniqueName: \"kubernetes.io/empty-dir/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-utilities\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.824711 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7nmd\" (UniqueName: \"kubernetes.io/projected/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-kube-api-access-x7nmd\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.873325 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gp6tz"] Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.874486 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.877068 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.884094 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp6tz"] Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925808 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75dffdea-9f1c-4648-8a59-ed96c2890b61-config\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925864 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-client-ca\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925890 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-config\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925918 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-utilities\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925939 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7nmd\" (UniqueName: \"kubernetes.io/projected/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-kube-api-access-x7nmd\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925955 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75dffdea-9f1c-4648-8a59-ed96c2890b61-client-ca\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.925988 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75dffdea-9f1c-4648-8a59-ed96c2890b61-serving-cert\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.926007 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de5841b7-3157-4bac-823b-d94d72103524-serving-cert\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.926021 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bc2n\" (UniqueName: \"kubernetes.io/projected/75dffdea-9f1c-4648-8a59-ed96c2890b61-kube-api-access-7bc2n\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.926038 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-proxy-ca-bundles\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.926453 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-utilities\") pod 
\"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.926830 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmgz\" (UniqueName: \"kubernetes.io/projected/de5841b7-3157-4bac-823b-d94d72103524-kube-api-access-vkmgz\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.926866 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-catalog-content\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.927110 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-catalog-content\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:35 crc kubenswrapper[4992]: I0131 09:31:35.947753 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7nmd\" (UniqueName: \"kubernetes.io/projected/26e0ae4d-f1ed-4368-a7b9-b2273ab80827-kube-api-access-x7nmd\") pod \"certified-operators-v7k4c\" (UID: \"26e0ae4d-f1ed-4368-a7b9-b2273ab80827\") " pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.028727 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/de53be86-6678-489f-9308-9379267f3295-utilities\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.028808 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-config\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.028904 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75dffdea-9f1c-4648-8a59-ed96c2890b61-client-ca\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030112 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75dffdea-9f1c-4648-8a59-ed96c2890b61-client-ca\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030718 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-config\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030852 4992 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75dffdea-9f1c-4648-8a59-ed96c2890b61-serving-cert\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030877 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de53be86-6678-489f-9308-9379267f3295-catalog-content\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030913 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de5841b7-3157-4bac-823b-d94d72103524-serving-cert\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030935 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bc2n\" (UniqueName: \"kubernetes.io/projected/75dffdea-9f1c-4648-8a59-ed96c2890b61-kube-api-access-7bc2n\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030957 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-proxy-ca-bundles\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " 
pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.030991 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zmp\" (UniqueName: \"kubernetes.io/projected/de53be86-6678-489f-9308-9379267f3295-kube-api-access-s7zmp\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.031020 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmgz\" (UniqueName: \"kubernetes.io/projected/de5841b7-3157-4bac-823b-d94d72103524-kube-api-access-vkmgz\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.031079 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75dffdea-9f1c-4648-8a59-ed96c2890b61-config\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.031118 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-client-ca\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.032349 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-proxy-ca-bundles\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.033071 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75dffdea-9f1c-4648-8a59-ed96c2890b61-config\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.034756 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75dffdea-9f1c-4648-8a59-ed96c2890b61-serving-cert\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.035669 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de5841b7-3157-4bac-823b-d94d72103524-serving-cert\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.036065 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de5841b7-3157-4bac-823b-d94d72103524-client-ca\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.036477 4992 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.051131 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmgz\" (UniqueName: \"kubernetes.io/projected/de5841b7-3157-4bac-823b-d94d72103524-kube-api-access-vkmgz\") pod \"controller-manager-6dd6bcf659-wj8wd\" (UID: \"de5841b7-3157-4bac-823b-d94d72103524\") " pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.055788 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bc2n\" (UniqueName: \"kubernetes.io/projected/75dffdea-9f1c-4648-8a59-ed96c2890b61-kube-api-access-7bc2n\") pod \"route-controller-manager-574484fcb5-frxqq\" (UID: \"75dffdea-9f1c-4648-8a59-ed96c2890b61\") " pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.068731 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.074136 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.131789 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de53be86-6678-489f-9308-9379267f3295-utilities\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.131857 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de53be86-6678-489f-9308-9379267f3295-catalog-content\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.131888 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zmp\" (UniqueName: \"kubernetes.io/projected/de53be86-6678-489f-9308-9379267f3295-kube-api-access-s7zmp\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.132654 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de53be86-6678-489f-9308-9379267f3295-utilities\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.133722 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de53be86-6678-489f-9308-9379267f3295-catalog-content\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " 
pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.150094 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zmp\" (UniqueName: \"kubernetes.io/projected/de53be86-6678-489f-9308-9379267f3295-kube-api-access-s7zmp\") pod \"redhat-operators-gp6tz\" (UID: \"de53be86-6678-489f-9308-9379267f3295\") " pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.199519 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.476621 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7k4c"] Jan 31 09:31:36 crc kubenswrapper[4992]: W0131 09:31:36.486513 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e0ae4d_f1ed_4368_a7b9_b2273ab80827.slice/crio-f25ded8c9bd342cf4bc7f61e225f53438846fd9102874e8f319029573afe22b9 WatchSource:0}: Error finding container f25ded8c9bd342cf4bc7f61e225f53438846fd9102874e8f319029573afe22b9: Status 404 returned error can't find the container with id f25ded8c9bd342cf4bc7f61e225f53438846fd9102874e8f319029573afe22b9 Jan 31 09:31:36 crc kubenswrapper[4992]: W0131 09:31:36.540573 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75dffdea_9f1c_4648_8a59_ed96c2890b61.slice/crio-f6ed40bc9bd8e6bbf3c36f5f7d96c1c897bed41da8a7f15a19894cb7db7b796f WatchSource:0}: Error finding container f6ed40bc9bd8e6bbf3c36f5f7d96c1c897bed41da8a7f15a19894cb7db7b796f: Status 404 returned error can't find the container with id f6ed40bc9bd8e6bbf3c36f5f7d96c1c897bed41da8a7f15a19894cb7db7b796f Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.540917 4992 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd"] Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.544220 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq"] Jan 31 09:31:36 crc kubenswrapper[4992]: W0131 09:31:36.559842 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5841b7_3157_4bac_823b_d94d72103524.slice/crio-ab62151840d5e545c96fda428262606fb6822dc72cec89b83f0d64572cf56295 WatchSource:0}: Error finding container ab62151840d5e545c96fda428262606fb6822dc72cec89b83f0d64572cf56295: Status 404 returned error can't find the container with id ab62151840d5e545c96fda428262606fb6822dc72cec89b83f0d64572cf56295 Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.628346 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp6tz"] Jan 31 09:31:36 crc kubenswrapper[4992]: W0131 09:31:36.638166 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde53be86_6678_489f_9308_9379267f3295.slice/crio-476c3d6ed9b77b03ea768e8ae7195594142b0b006bdbed64545d5e2271d19b5d WatchSource:0}: Error finding container 476c3d6ed9b77b03ea768e8ae7195594142b0b006bdbed64545d5e2271d19b5d: Status 404 returned error can't find the container with id 476c3d6ed9b77b03ea768e8ae7195594142b0b006bdbed64545d5e2271d19b5d Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.759816 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp6tz" event={"ID":"de53be86-6678-489f-9308-9379267f3295","Type":"ContainerStarted","Data":"048a95ee6c4995a00a1689ec6d68fbaa240eaf3cc65dcc872739d8e386bba355"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.760060 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gp6tz" event={"ID":"de53be86-6678-489f-9308-9379267f3295","Type":"ContainerStarted","Data":"476c3d6ed9b77b03ea768e8ae7195594142b0b006bdbed64545d5e2271d19b5d"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.761326 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" event={"ID":"75dffdea-9f1c-4648-8a59-ed96c2890b61","Type":"ContainerStarted","Data":"264e9a1ae8f3374a58ef79e0f9b5e6185f85eaf8c7351b2851fb87b218fb26c1"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.761374 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" event={"ID":"75dffdea-9f1c-4648-8a59-ed96c2890b61","Type":"ContainerStarted","Data":"f6ed40bc9bd8e6bbf3c36f5f7d96c1c897bed41da8a7f15a19894cb7db7b796f"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.762035 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.764191 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" event={"ID":"de5841b7-3157-4bac-823b-d94d72103524","Type":"ContainerStarted","Data":"04581c9c547169df3e5809c06dae4850af7a4ee30e6616b3ae2d40cc54b1c8e6"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.764258 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" event={"ID":"de5841b7-3157-4bac-823b-d94d72103524","Type":"ContainerStarted","Data":"ab62151840d5e545c96fda428262606fb6822dc72cec89b83f0d64572cf56295"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.764281 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.765601 4992 patch_prober.go:28] interesting pod/controller-manager-6dd6bcf659-wj8wd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" start-of-body= Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.765642 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" podUID="de5841b7-3157-4bac-823b-d94d72103524" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": dial tcp 10.217.0.72:8443: connect: connection refused" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.768334 4992 generic.go:334] "Generic (PLEG): container finished" podID="ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977" containerID="38f2a31b41378ed85b3b0f310f2097bebd07d388c2d26ac2a1328d1f3a6b421f" exitCode=0 Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.768401 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgmqt" event={"ID":"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977","Type":"ContainerDied","Data":"38f2a31b41378ed85b3b0f310f2097bebd07d388c2d26ac2a1328d1f3a6b421f"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.770243 4992 generic.go:334] "Generic (PLEG): container finished" podID="26e0ae4d-f1ed-4368-a7b9-b2273ab80827" containerID="70dcdadb163743b7bb8fe043f88c33eeb3298ec084a5530437b0d277f010f055" exitCode=0 Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.770280 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7k4c" event={"ID":"26e0ae4d-f1ed-4368-a7b9-b2273ab80827","Type":"ContainerDied","Data":"70dcdadb163743b7bb8fe043f88c33eeb3298ec084a5530437b0d277f010f055"} Jan 31 
09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.770303 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7k4c" event={"ID":"26e0ae4d-f1ed-4368-a7b9-b2273ab80827","Type":"ContainerStarted","Data":"f25ded8c9bd342cf4bc7f61e225f53438846fd9102874e8f319029573afe22b9"} Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.838220 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" podStartSLOduration=2.83820221 podStartE2EDuration="2.83820221s" podCreationTimestamp="2026-01-31 09:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:31:36.837451167 +0000 UTC m=+392.808843174" watchObservedRunningTime="2026-01-31 09:31:36.83820221 +0000 UTC m=+392.809594197" Jan 31 09:31:36 crc kubenswrapper[4992]: I0131 09:31:36.859114 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" podStartSLOduration=2.859095711 podStartE2EDuration="2.859095711s" podCreationTimestamp="2026-01-31 09:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:31:36.858114552 +0000 UTC m=+392.829506559" watchObservedRunningTime="2026-01-31 09:31:36.859095711 +0000 UTC m=+392.830487698" Jan 31 09:31:37 crc kubenswrapper[4992]: I0131 09:31:37.446478 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-574484fcb5-frxqq" Jan 31 09:31:37 crc kubenswrapper[4992]: I0131 09:31:37.778133 4992 generic.go:334] "Generic (PLEG): container finished" podID="de53be86-6678-489f-9308-9379267f3295" containerID="048a95ee6c4995a00a1689ec6d68fbaa240eaf3cc65dcc872739d8e386bba355" exitCode=0 Jan 31 
09:31:37 crc kubenswrapper[4992]: I0131 09:31:37.778187 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp6tz" event={"ID":"de53be86-6678-489f-9308-9379267f3295","Type":"ContainerDied","Data":"048a95ee6c4995a00a1689ec6d68fbaa240eaf3cc65dcc872739d8e386bba355"} Jan 31 09:31:37 crc kubenswrapper[4992]: I0131 09:31:37.785330 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zs9s" event={"ID":"265a14af-f30c-46a1-9618-e3b1e406f841","Type":"ContainerStarted","Data":"f78bdb6280600702acb73b65ffd97ad7f0433ec54fae12d38a1d03d616e9381f"} Jan 31 09:31:37 crc kubenswrapper[4992]: I0131 09:31:37.802502 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dd6bcf659-wj8wd" Jan 31 09:31:38 crc kubenswrapper[4992]: I0131 09:31:38.792353 4992 generic.go:334] "Generic (PLEG): container finished" podID="265a14af-f30c-46a1-9618-e3b1e406f841" containerID="f78bdb6280600702acb73b65ffd97ad7f0433ec54fae12d38a1d03d616e9381f" exitCode=0 Jan 31 09:31:38 crc kubenswrapper[4992]: I0131 09:31:38.792466 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zs9s" event={"ID":"265a14af-f30c-46a1-9618-e3b1e406f841","Type":"ContainerDied","Data":"f78bdb6280600702acb73b65ffd97ad7f0433ec54fae12d38a1d03d616e9381f"} Jan 31 09:31:38 crc kubenswrapper[4992]: I0131 09:31:38.795592 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgmqt" event={"ID":"ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977","Type":"ContainerStarted","Data":"0ef274e00fc6a76cdd8e06ef714ed9ec2ef9a4242eb4f28c5067017d62a1ef48"} Jan 31 09:31:39 crc kubenswrapper[4992]: I0131 09:31:39.812567 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7k4c" 
event={"ID":"26e0ae4d-f1ed-4368-a7b9-b2273ab80827","Type":"ContainerStarted","Data":"c2ca1a6417e30467122fc45c930d7fb18c94020cf2270bd727108ab50fedb5cd"} Jan 31 09:31:40 crc kubenswrapper[4992]: I0131 09:31:40.818892 4992 generic.go:334] "Generic (PLEG): container finished" podID="26e0ae4d-f1ed-4368-a7b9-b2273ab80827" containerID="c2ca1a6417e30467122fc45c930d7fb18c94020cf2270bd727108ab50fedb5cd" exitCode=0 Jan 31 09:31:40 crc kubenswrapper[4992]: I0131 09:31:40.818956 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7k4c" event={"ID":"26e0ae4d-f1ed-4368-a7b9-b2273ab80827","Type":"ContainerDied","Data":"c2ca1a6417e30467122fc45c930d7fb18c94020cf2270bd727108ab50fedb5cd"} Jan 31 09:31:40 crc kubenswrapper[4992]: I0131 09:31:40.841885 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fgmqt" podStartSLOduration=4.84104496 podStartE2EDuration="7.841849717s" podCreationTimestamp="2026-01-31 09:31:33 +0000 UTC" firstStartedPulling="2026-01-31 09:31:34.73131648 +0000 UTC m=+390.702708467" lastFinishedPulling="2026-01-31 09:31:37.732121237 +0000 UTC m=+393.703513224" observedRunningTime="2026-01-31 09:31:38.835016244 +0000 UTC m=+394.806408251" watchObservedRunningTime="2026-01-31 09:31:40.841849717 +0000 UTC m=+396.813241764" Jan 31 09:31:41 crc kubenswrapper[4992]: I0131 09:31:41.826752 4992 generic.go:334] "Generic (PLEG): container finished" podID="de53be86-6678-489f-9308-9379267f3295" containerID="1310707746dbe17e0e0c02d4da4a0debdcdbb1391e5fff971973b4dcda9f417f" exitCode=0 Jan 31 09:31:41 crc kubenswrapper[4992]: I0131 09:31:41.826890 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp6tz" event={"ID":"de53be86-6678-489f-9308-9379267f3295","Type":"ContainerDied","Data":"1310707746dbe17e0e0c02d4da4a0debdcdbb1391e5fff971973b4dcda9f417f"} Jan 31 09:31:42 crc kubenswrapper[4992]: I0131 09:31:42.835203 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zs9s" event={"ID":"265a14af-f30c-46a1-9618-e3b1e406f841","Type":"ContainerStarted","Data":"0f179d239a1cf97cc96f5446f9e2130133c9967a42e02c2f88c503bae64e8b8b"} Jan 31 09:31:42 crc kubenswrapper[4992]: I0131 09:31:42.861053 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8zs9s" podStartSLOduration=3.621456929 podStartE2EDuration="9.861033447s" podCreationTimestamp="2026-01-31 09:31:33 +0000 UTC" firstStartedPulling="2026-01-31 09:31:34.729006931 +0000 UTC m=+390.700398918" lastFinishedPulling="2026-01-31 09:31:40.968583449 +0000 UTC m=+396.939975436" observedRunningTime="2026-01-31 09:31:42.859293835 +0000 UTC m=+398.830685842" watchObservedRunningTime="2026-01-31 09:31:42.861033447 +0000 UTC m=+398.832425444" Jan 31 09:31:43 crc kubenswrapper[4992]: I0131 09:31:43.594937 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8zs9s" Jan 31 09:31:43 crc kubenswrapper[4992]: I0131 09:31:43.595394 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8zs9s" Jan 31 09:31:43 crc kubenswrapper[4992]: I0131 09:31:43.791656 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fgmqt" Jan 31 09:31:43 crc kubenswrapper[4992]: I0131 09:31:43.792082 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fgmqt" Jan 31 09:31:43 crc kubenswrapper[4992]: I0131 09:31:43.829555 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fgmqt" Jan 31 09:31:43 crc kubenswrapper[4992]: I0131 09:31:43.881960 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-fgmqt" Jan 31 09:31:44 crc kubenswrapper[4992]: I0131 09:31:44.635191 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-8zs9s" podUID="265a14af-f30c-46a1-9618-e3b1e406f841" containerName="registry-server" probeResult="failure" output=< Jan 31 09:31:44 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 09:31:44 crc kubenswrapper[4992]: > Jan 31 09:31:44 crc kubenswrapper[4992]: I0131 09:31:44.846096 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7k4c" event={"ID":"26e0ae4d-f1ed-4368-a7b9-b2273ab80827","Type":"ContainerStarted","Data":"602bb5db76fa83a18513474056476e7ae6499edf3d7f70cc1dfaab806d8e5b90"} Jan 31 09:31:45 crc kubenswrapper[4992]: I0131 09:31:45.301128 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:31:45 crc kubenswrapper[4992]: I0131 09:31:45.301236 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:31:45 crc kubenswrapper[4992]: I0131 09:31:45.872719 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7k4c" podStartSLOduration=4.914676381 podStartE2EDuration="10.872690219s" podCreationTimestamp="2026-01-31 09:31:35 +0000 UTC" firstStartedPulling="2026-01-31 09:31:36.772222076 +0000 UTC m=+392.743614053" lastFinishedPulling="2026-01-31 09:31:42.730235904 +0000 UTC 
m=+398.701627891" observedRunningTime="2026-01-31 09:31:45.867613828 +0000 UTC m=+401.839005835" watchObservedRunningTime="2026-01-31 09:31:45.872690219 +0000 UTC m=+401.844082246" Jan 31 09:31:46 crc kubenswrapper[4992]: I0131 09:31:46.037529 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:46 crc kubenswrapper[4992]: I0131 09:31:46.037593 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:46 crc kubenswrapper[4992]: I0131 09:31:46.859187 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp6tz" event={"ID":"de53be86-6678-489f-9308-9379267f3295","Type":"ContainerStarted","Data":"b1c63e63d7e2ecf45be2b97eb306e518d8297989d33ca3e4223d6b163dcaa971"} Jan 31 09:31:47 crc kubenswrapper[4992]: I0131 09:31:47.096789 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-v7k4c" podUID="26e0ae4d-f1ed-4368-a7b9-b2273ab80827" containerName="registry-server" probeResult="failure" output=< Jan 31 09:31:47 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 09:31:47 crc kubenswrapper[4992]: > Jan 31 09:31:48 crc kubenswrapper[4992]: I0131 09:31:48.901989 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gp6tz" podStartSLOduration=5.693413238 podStartE2EDuration="13.901972074s" podCreationTimestamp="2026-01-31 09:31:35 +0000 UTC" firstStartedPulling="2026-01-31 09:31:37.780004452 +0000 UTC m=+393.751396439" lastFinishedPulling="2026-01-31 09:31:45.988563288 +0000 UTC m=+401.959955275" observedRunningTime="2026-01-31 09:31:48.896831391 +0000 UTC m=+404.868223408" watchObservedRunningTime="2026-01-31 09:31:48.901972074 +0000 UTC m=+404.873364061" Jan 31 09:31:49 crc kubenswrapper[4992]: I0131 09:31:49.766152 4992 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" podUID="754c2a0a-7622-4316-9706-e8499dd756a5" containerName="registry" containerID="cri-o://f50160ea56130f69a407cde35580a684c293b1eb690803fa21132cae1e58d7f7" gracePeriod=30 Jan 31 09:31:50 crc kubenswrapper[4992]: I0131 09:31:50.883259 4992 generic.go:334] "Generic (PLEG): container finished" podID="754c2a0a-7622-4316-9706-e8499dd756a5" containerID="f50160ea56130f69a407cde35580a684c293b1eb690803fa21132cae1e58d7f7" exitCode=0 Jan 31 09:31:50 crc kubenswrapper[4992]: I0131 09:31:50.883313 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" event={"ID":"754c2a0a-7622-4316-9706-e8499dd756a5","Type":"ContainerDied","Data":"f50160ea56130f69a407cde35580a684c293b1eb690803fa21132cae1e58d7f7"} Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.744681 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.805139 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w459n\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-kube-api-access-w459n\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.805186 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/754c2a0a-7622-4316-9706-e8499dd756a5-ca-trust-extracted\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.805228 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/754c2a0a-7622-4316-9706-e8499dd756a5-installation-pull-secrets\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822064 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822126 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-bound-sa-token\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822159 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-registry-certificates\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822196 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-registry-tls\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822220 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-trusted-ca\") pod \"754c2a0a-7622-4316-9706-e8499dd756a5\" (UID: \"754c2a0a-7622-4316-9706-e8499dd756a5\") " Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822625 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754c2a0a-7622-4316-9706-e8499dd756a5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.822927 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.823296 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.830756 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-kube-api-access-w459n" (OuterVolumeSpecName: "kube-api-access-w459n") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "kube-api-access-w459n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.831697 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.832169 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.839844 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/754c2a0a-7622-4316-9706-e8499dd756a5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.845669 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "754c2a0a-7622-4316-9706-e8499dd756a5" (UID: "754c2a0a-7622-4316-9706-e8499dd756a5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.894001 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" event={"ID":"754c2a0a-7622-4316-9706-e8499dd756a5","Type":"ContainerDied","Data":"ad60edf821e1dcc0f27ebe4cb2fcb3c7fd7e4dec9e8b35b696ef5c5fb0ba5de9"} Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.894052 4992 scope.go:117] "RemoveContainer" containerID="f50160ea56130f69a407cde35580a684c293b1eb690803fa21132cae1e58d7f7" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.894081 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-j6dj7" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924375 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-j6dj7"] Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924382 4992 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/754c2a0a-7622-4316-9706-e8499dd756a5-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924493 4992 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/754c2a0a-7622-4316-9706-e8499dd756a5-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924509 4992 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924524 4992 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924539 4992 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924552 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/754c2a0a-7622-4316-9706-e8499dd756a5-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.924563 4992 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w459n\" (UniqueName: \"kubernetes.io/projected/754c2a0a-7622-4316-9706-e8499dd756a5-kube-api-access-w459n\") on node \"crc\" DevicePath \"\"" Jan 31 09:31:52 crc kubenswrapper[4992]: I0131 09:31:52.928287 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-j6dj7"] Jan 31 09:31:53 crc kubenswrapper[4992]: I0131 09:31:53.189288 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754c2a0a-7622-4316-9706-e8499dd756a5" path="/var/lib/kubelet/pods/754c2a0a-7622-4316-9706-e8499dd756a5/volumes" Jan 31 09:31:53 crc kubenswrapper[4992]: I0131 09:31:53.651154 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8zs9s" Jan 31 09:31:53 crc kubenswrapper[4992]: I0131 09:31:53.704922 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8zs9s" Jan 31 09:31:56 crc kubenswrapper[4992]: I0131 09:31:56.115800 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:56 crc kubenswrapper[4992]: I0131 09:31:56.160577 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7k4c" Jan 31 09:31:56 crc kubenswrapper[4992]: I0131 09:31:56.200482 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:56 crc kubenswrapper[4992]: I0131 09:31:56.200525 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:56 crc kubenswrapper[4992]: I0131 09:31:56.239992 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:31:56 crc 
kubenswrapper[4992]: I0131 09:31:56.961326 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gp6tz" Jan 31 09:32:15 crc kubenswrapper[4992]: I0131 09:32:15.300776 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:32:15 crc kubenswrapper[4992]: I0131 09:32:15.301337 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:32:15 crc kubenswrapper[4992]: I0131 09:32:15.301396 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:32:15 crc kubenswrapper[4992]: I0131 09:32:15.302267 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"316b87cd3c7723d9291c5891a182a9cf97966bbee250a4d2b5a93c61c18b536c"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:32:15 crc kubenswrapper[4992]: I0131 09:32:15.302381 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://316b87cd3c7723d9291c5891a182a9cf97966bbee250a4d2b5a93c61c18b536c" gracePeriod=600 Jan 31 09:32:16 crc kubenswrapper[4992]: I0131 
09:32:16.026987 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="316b87cd3c7723d9291c5891a182a9cf97966bbee250a4d2b5a93c61c18b536c" exitCode=0 Jan 31 09:32:16 crc kubenswrapper[4992]: I0131 09:32:16.027108 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"316b87cd3c7723d9291c5891a182a9cf97966bbee250a4d2b5a93c61c18b536c"} Jan 31 09:32:16 crc kubenswrapper[4992]: I0131 09:32:16.027663 4992 scope.go:117] "RemoveContainer" containerID="afeb407015e9652521bee55cbb82a50a6e097a489ce1638dc56464b511c20af0" Jan 31 09:32:17 crc kubenswrapper[4992]: I0131 09:32:17.038263 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"985e7c05e12ab986daeed0403a8d03b948841da13cf4731f55f5b3f5414e74ac"} Jan 31 09:34:45 crc kubenswrapper[4992]: I0131 09:34:45.301109 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:34:45 crc kubenswrapper[4992]: I0131 09:34:45.301737 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:35:15 crc kubenswrapper[4992]: I0131 09:35:15.301317 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:35:15 crc kubenswrapper[4992]: I0131 09:35:15.301853 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:35:45 crc kubenswrapper[4992]: I0131 09:35:45.301228 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:35:45 crc kubenswrapper[4992]: I0131 09:35:45.302071 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:35:45 crc kubenswrapper[4992]: I0131 09:35:45.302132 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:35:45 crc kubenswrapper[4992]: I0131 09:35:45.303108 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"985e7c05e12ab986daeed0403a8d03b948841da13cf4731f55f5b3f5414e74ac"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:35:45 crc 
kubenswrapper[4992]: I0131 09:35:45.303280 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://985e7c05e12ab986daeed0403a8d03b948841da13cf4731f55f5b3f5414e74ac" gracePeriod=600 Jan 31 09:35:46 crc kubenswrapper[4992]: I0131 09:35:46.226013 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="985e7c05e12ab986daeed0403a8d03b948841da13cf4731f55f5b3f5414e74ac" exitCode=0 Jan 31 09:35:46 crc kubenswrapper[4992]: I0131 09:35:46.226101 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"985e7c05e12ab986daeed0403a8d03b948841da13cf4731f55f5b3f5414e74ac"} Jan 31 09:35:46 crc kubenswrapper[4992]: I0131 09:35:46.226357 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"dcfc3dfd610126640e5e892b66ddc4eb9fe55b9da0dd6e87ee131f4f08a55e7c"} Jan 31 09:35:46 crc kubenswrapper[4992]: I0131 09:35:46.226385 4992 scope.go:117] "RemoveContainer" containerID="316b87cd3c7723d9291c5891a182a9cf97966bbee250a4d2b5a93c61c18b536c" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.572438 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv"] Jan 31 09:37:38 crc kubenswrapper[4992]: E0131 09:37:38.573134 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754c2a0a-7622-4316-9706-e8499dd756a5" containerName="registry" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.573222 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="754c2a0a-7622-4316-9706-e8499dd756a5" containerName="registry" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.573339 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="754c2a0a-7622-4316-9706-e8499dd756a5" containerName="registry" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.573700 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.582728 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.582801 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.582808 4992 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-srxcz" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.582876 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ckw79"] Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.583668 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ckw79" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.589792 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv"] Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.597491 4992 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ch5ld" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.602068 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-69l5h"] Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.602968 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.605002 4992 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-s4vfr" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.619625 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ckw79"] Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.629340 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-69l5h"] Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.694444 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xpt\" (UniqueName: \"kubernetes.io/projected/e16aae74-abb9-4397-a7f7-a4ce1e5d88ae-kube-api-access-h6xpt\") pod \"cert-manager-cainjector-cf98fcc89-7lhxv\" (UID: \"e16aae74-abb9-4397-a7f7-a4ce1e5d88ae\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.694559 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xbcq\" (UniqueName: \"kubernetes.io/projected/6804c6e1-c251-40e6-aef2-7c5034b576aa-kube-api-access-8xbcq\") pod \"cert-manager-webhook-687f57d79b-69l5h\" (UID: \"6804c6e1-c251-40e6-aef2-7c5034b576aa\") " pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.694666 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5m9\" (UniqueName: \"kubernetes.io/projected/4a7adcb8-e515-4542-be62-9dbc5de42601-kube-api-access-2l5m9\") pod \"cert-manager-858654f9db-ckw79\" (UID: \"4a7adcb8-e515-4542-be62-9dbc5de42601\") " pod="cert-manager/cert-manager-858654f9db-ckw79" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.796404 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xpt\" (UniqueName: \"kubernetes.io/projected/e16aae74-abb9-4397-a7f7-a4ce1e5d88ae-kube-api-access-h6xpt\") pod \"cert-manager-cainjector-cf98fcc89-7lhxv\" (UID: \"e16aae74-abb9-4397-a7f7-a4ce1e5d88ae\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.796598 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xbcq\" (UniqueName: \"kubernetes.io/projected/6804c6e1-c251-40e6-aef2-7c5034b576aa-kube-api-access-8xbcq\") pod \"cert-manager-webhook-687f57d79b-69l5h\" (UID: \"6804c6e1-c251-40e6-aef2-7c5034b576aa\") " pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.796631 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5m9\" (UniqueName: \"kubernetes.io/projected/4a7adcb8-e515-4542-be62-9dbc5de42601-kube-api-access-2l5m9\") pod \"cert-manager-858654f9db-ckw79\" (UID: \"4a7adcb8-e515-4542-be62-9dbc5de42601\") " pod="cert-manager/cert-manager-858654f9db-ckw79" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.814218 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xpt\" (UniqueName: \"kubernetes.io/projected/e16aae74-abb9-4397-a7f7-a4ce1e5d88ae-kube-api-access-h6xpt\") pod \"cert-manager-cainjector-cf98fcc89-7lhxv\" (UID: \"e16aae74-abb9-4397-a7f7-a4ce1e5d88ae\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.816291 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5m9\" (UniqueName: \"kubernetes.io/projected/4a7adcb8-e515-4542-be62-9dbc5de42601-kube-api-access-2l5m9\") pod \"cert-manager-858654f9db-ckw79\" (UID: \"4a7adcb8-e515-4542-be62-9dbc5de42601\") " 
pod="cert-manager/cert-manager-858654f9db-ckw79" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.816686 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xbcq\" (UniqueName: \"kubernetes.io/projected/6804c6e1-c251-40e6-aef2-7c5034b576aa-kube-api-access-8xbcq\") pod \"cert-manager-webhook-687f57d79b-69l5h\" (UID: \"6804c6e1-c251-40e6-aef2-7c5034b576aa\") " pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.895282 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.902083 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ckw79" Jan 31 09:37:38 crc kubenswrapper[4992]: I0131 09:37:38.916108 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.236764 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-69l5h"] Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.254490 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.366145 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ckw79"] Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.370550 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv"] Jan 31 09:37:39 crc kubenswrapper[4992]: W0131 09:37:39.372408 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7adcb8_e515_4542_be62_9dbc5de42601.slice/crio-6b9a12a360db1b5e838feadf70d553e2d5bc7b0e1ecaa755c1bccfaffc938038 WatchSource:0}: Error finding container 6b9a12a360db1b5e838feadf70d553e2d5bc7b0e1ecaa755c1bccfaffc938038: Status 404 returned error can't find the container with id 6b9a12a360db1b5e838feadf70d553e2d5bc7b0e1ecaa755c1bccfaffc938038 Jan 31 09:37:39 crc kubenswrapper[4992]: W0131 09:37:39.377725 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16aae74_abb9_4397_a7f7_a4ce1e5d88ae.slice/crio-8994018486d930049bdb09bd92f16699016417d27c62264b4e10fb8fb22bc302 WatchSource:0}: Error finding container 8994018486d930049bdb09bd92f16699016417d27c62264b4e10fb8fb22bc302: Status 404 returned error can't find the container with id 8994018486d930049bdb09bd92f16699016417d27c62264b4e10fb8fb22bc302 Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.832553 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" event={"ID":"6804c6e1-c251-40e6-aef2-7c5034b576aa","Type":"ContainerStarted","Data":"28697b283c4371ef19e6f340bc72f5df5e45fc0b818e35bd1dd36dd804a95e95"} Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.834437 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ckw79" event={"ID":"4a7adcb8-e515-4542-be62-9dbc5de42601","Type":"ContainerStarted","Data":"6b9a12a360db1b5e838feadf70d553e2d5bc7b0e1ecaa755c1bccfaffc938038"} Jan 31 09:37:39 crc kubenswrapper[4992]: I0131 09:37:39.835989 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" event={"ID":"e16aae74-abb9-4397-a7f7-a4ce1e5d88ae","Type":"ContainerStarted","Data":"8994018486d930049bdb09bd92f16699016417d27c62264b4e10fb8fb22bc302"} Jan 31 09:37:41 crc kubenswrapper[4992]: I0131 09:37:41.210038 4992 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:37:43 crc kubenswrapper[4992]: I0131 09:37:43.869465 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" event={"ID":"e16aae74-abb9-4397-a7f7-a4ce1e5d88ae","Type":"ContainerStarted","Data":"13ba9f8bcb93ffaca9e6a376fd8055744d2a712169ec36e7c8ce3f73cf01c834"} Jan 31 09:37:43 crc kubenswrapper[4992]: I0131 09:37:43.872409 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" event={"ID":"6804c6e1-c251-40e6-aef2-7c5034b576aa","Type":"ContainerStarted","Data":"347e7e64bf81a0ca26770cf903808f3930b7ce1d3e395be692cd88537b9149ec"} Jan 31 09:37:43 crc kubenswrapper[4992]: I0131 09:37:43.872682 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:43 crc kubenswrapper[4992]: I0131 09:37:43.888995 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-7lhxv" podStartSLOduration=2.310950717 podStartE2EDuration="5.888972744s" podCreationTimestamp="2026-01-31 09:37:38 +0000 UTC" firstStartedPulling="2026-01-31 09:37:39.379458381 +0000 UTC m=+755.350850368" lastFinishedPulling="2026-01-31 09:37:42.957480408 +0000 UTC m=+758.928872395" observedRunningTime="2026-01-31 09:37:43.881290742 +0000 UTC m=+759.852682739" watchObservedRunningTime="2026-01-31 09:37:43.888972744 +0000 UTC m=+759.860364751" Jan 31 09:37:43 crc kubenswrapper[4992]: I0131 09:37:43.899535 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" podStartSLOduration=2.137508994 podStartE2EDuration="5.899517059s" podCreationTimestamp="2026-01-31 09:37:38 +0000 UTC" firstStartedPulling="2026-01-31 09:37:39.254261762 +0000 UTC m=+755.225653749" 
lastFinishedPulling="2026-01-31 09:37:43.016269817 +0000 UTC m=+758.987661814" observedRunningTime="2026-01-31 09:37:43.898360266 +0000 UTC m=+759.869752283" watchObservedRunningTime="2026-01-31 09:37:43.899517059 +0000 UTC m=+759.870909046" Jan 31 09:37:44 crc kubenswrapper[4992]: I0131 09:37:44.878946 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ckw79" event={"ID":"4a7adcb8-e515-4542-be62-9dbc5de42601","Type":"ContainerStarted","Data":"29e8a08131be5909a8fb2e045f2f219866365b8820b8fa3edee8ae7a6caf1113"} Jan 31 09:37:44 crc kubenswrapper[4992]: I0131 09:37:44.893938 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ckw79" podStartSLOduration=1.7925119409999999 podStartE2EDuration="6.893898202s" podCreationTimestamp="2026-01-31 09:37:38 +0000 UTC" firstStartedPulling="2026-01-31 09:37:39.375680722 +0000 UTC m=+755.347072709" lastFinishedPulling="2026-01-31 09:37:44.477066973 +0000 UTC m=+760.448458970" observedRunningTime="2026-01-31 09:37:44.89104432 +0000 UTC m=+760.862436307" watchObservedRunningTime="2026-01-31 09:37:44.893898202 +0000 UTC m=+760.865290189" Jan 31 09:37:45 crc kubenswrapper[4992]: I0131 09:37:45.301317 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:37:45 crc kubenswrapper[4992]: I0131 09:37:45.301384 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 
09:37:48.189925 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-46cdx"] Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.190689 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-controller" containerID="cri-o://24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.190778 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="nbdb" containerID="cri-o://0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.190836 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-node" containerID="cri-o://d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.190807 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.190876 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-acl-logging" containerID="cri-o://d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" gracePeriod=30 Jan 31 09:37:48 crc 
kubenswrapper[4992]: I0131 09:37:48.190966 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="northd" containerID="cri-o://11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.191079 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="sbdb" containerID="cri-o://c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.229161 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" containerID="cri-o://b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" gracePeriod=30 Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.432110 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 is running failed: container process not found" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.432172 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d is running failed: container process not found" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.432647 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 is running failed: container process not found" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.432680 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d is running failed: container process not found" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.433064 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 is running failed: container process not found" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.433189 4992 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="sbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.433202 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d is running failed: container process not found" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.433252 4992 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="nbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.558709 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/3.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.561274 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovn-acl-logging/0.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.561724 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovn-controller/0.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.562326 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.617894 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tqpkq"] Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618116 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618136 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618150 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-acl-logging" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618158 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-acl-logging" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618167 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618175 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618189 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="sbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618200 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="sbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618213 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kubecfg-setup" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618221 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kubecfg-setup" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618234 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618241 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618252 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618259 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618271 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618277 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618286 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618291 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618298 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-node" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618304 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-node" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618316 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="nbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618321 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="nbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618329 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="northd" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618334 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="northd" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618475 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618485 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618493 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618499 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618505 4992 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-acl-logging" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618512 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="kube-rbac-proxy-node" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618519 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovn-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618527 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="nbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618535 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="sbdb" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618542 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="northd" Jan 31 09:37:48 crc kubenswrapper[4992]: E0131 09:37:48.618627 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618635 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618721 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.618741 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" containerName="ovnkube-controller" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.620256 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629438 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629481 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-log-socket\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629525 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-netns\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629548 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-openvswitch\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629575 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-slash\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629604 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-ovn-kubernetes\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629628 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-bin\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629543 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629567 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-log-socket" (OuterVolumeSpecName: "log-socket") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629667 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-config\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629692 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-env-overrides\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629713 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-ovn\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629734 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-systemd\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629765 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-netd\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629783 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-systemd-units\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629802 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-kubelet\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629826 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6939ca32-c541-41c0-ba96-4282b942ff16-ovn-node-metrics-cert\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629854 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-var-lib-openvswitch\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629879 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-etc-openvswitch\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629898 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-node-log\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 
09:37:48.629954 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsg2\" (UniqueName: \"kubernetes.io/projected/6939ca32-c541-41c0-ba96-4282b942ff16-kube-api-access-2dsg2\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629981 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-script-lib\") pod \"6939ca32-c541-41c0-ba96-4282b942ff16\" (UID: \"6939ca32-c541-41c0-ba96-4282b942ff16\") " Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629588 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629602 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629621 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-slash" (OuterVolumeSpecName: "host-slash") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629640 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.629682 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630080 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630104 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630165 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630410 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-node-log" (OuterVolumeSpecName: "node-log") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630462 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630484 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630507 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630615 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630738 4992 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-node-log\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630755 4992 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630766 4992 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630774 4992 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-log-socket\") on node 
\"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630782 4992 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630790 4992 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630800 4992 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-slash\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630808 4992 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630816 4992 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630825 4992 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630834 4992 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630841 4992 reconciler_common.go:293] "Volume 
detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630849 4992 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630856 4992 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.630864 4992 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.631197 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.631245 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.635817 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6939ca32-c541-41c0-ba96-4282b942ff16-kube-api-access-2dsg2" (OuterVolumeSpecName: "kube-api-access-2dsg2") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "kube-api-access-2dsg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.636139 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6939ca32-c541-41c0-ba96-4282b942ff16-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.648922 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6939ca32-c541-41c0-ba96-4282b942ff16" (UID: "6939ca32-c541-41c0-ba96-4282b942ff16"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.731906 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-run-ovn-kubernetes\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.731970 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-systemd\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732031 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732059 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-ovnkube-script-lib\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732099 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-ovn\") pod \"ovnkube-node-tqpkq\" (UID: 
\"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732125 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-log-socket\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732149 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-kubelet\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732170 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-etc-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732192 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-slash\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732217 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/31bc5549-6e44-46c0-9c49-9398e6311979-ovn-node-metrics-cert\") pod 
\"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732241 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-systemd-units\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732263 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-var-lib-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732286 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-cni-bin\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732320 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-cni-netd\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732345 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-env-overrides\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732363 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2t84\" (UniqueName: \"kubernetes.io/projected/31bc5549-6e44-46c0-9c49-9398e6311979-kube-api-access-w2t84\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732429 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-node-log\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732448 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732468 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-run-netns\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732482 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-ovnkube-config\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732526 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsg2\" (UniqueName: \"kubernetes.io/projected/6939ca32-c541-41c0-ba96-4282b942ff16-kube-api-access-2dsg2\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732540 4992 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732550 4992 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6939ca32-c541-41c0-ba96-4282b942ff16-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732559 4992 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6939ca32-c541-41c0-ba96-4282b942ff16-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.732568 4992 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6939ca32-c541-41c0-ba96-4282b942ff16-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.833869 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-systemd-units\") pod \"ovnkube-node-tqpkq\" (UID: 
\"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.833922 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-cni-bin\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.833940 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-var-lib-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.833958 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-cni-netd\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.833968 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-systemd-units\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834001 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-cni-bin\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 
31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.833979 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-env-overrides\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834052 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-cni-netd\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834080 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2t84\" (UniqueName: \"kubernetes.io/projected/31bc5549-6e44-46c0-9c49-9398e6311979-kube-api-access-w2t84\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834052 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-var-lib-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834117 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-node-log\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834141 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834150 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-node-log\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834165 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-run-netns\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834194 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-run-netns\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834215 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-ovnkube-config\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834224 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834288 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-run-ovn-kubernetes\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834319 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-systemd\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834352 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-run-ovn-kubernetes\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834375 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834397 4992 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-systemd\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834413 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-ovnkube-script-lib\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834448 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834488 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-ovn\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834536 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-log-socket\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834573 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-kubelet\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834592 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-etc-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834596 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-run-ovn\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834611 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-env-overrides\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834616 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-slash\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.835689 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/31bc5549-6e44-46c0-9c49-9398e6311979-ovn-node-metrics-cert\") pod \"ovnkube-node-tqpkq\" 
(UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834634 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-kubelet\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834657 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-log-socket\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834662 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-etc-openvswitch\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.835080 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-ovnkube-config\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.834641 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/31bc5549-6e44-46c0-9c49-9398e6311979-host-slash\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc 
kubenswrapper[4992]: I0131 09:37:48.835283 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/31bc5549-6e44-46c0-9c49-9398e6311979-ovnkube-script-lib\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.839828 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/31bc5549-6e44-46c0-9c49-9398e6311979-ovn-node-metrics-cert\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.849701 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2t84\" (UniqueName: \"kubernetes.io/projected/31bc5549-6e44-46c0-9c49-9398e6311979-kube-api-access-w2t84\") pod \"ovnkube-node-tqpkq\" (UID: \"31bc5549-6e44-46c0-9c49-9398e6311979\") " pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.900483 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovnkube-controller/3.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.902911 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovn-acl-logging/0.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.903753 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46cdx_6939ca32-c541-41c0-ba96-4282b942ff16/ovn-controller/0.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904127 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" exitCode=0 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904156 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" exitCode=0 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904165 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" exitCode=0 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904173 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" exitCode=0 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904180 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" exitCode=0 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904188 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" exitCode=0 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904195 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" exitCode=143 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904203 4992 generic.go:334] "Generic (PLEG): container finished" podID="6939ca32-c541-41c0-ba96-4282b942ff16" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" exitCode=143 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904216 4992 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904250 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904313 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904335 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904356 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904379 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904397 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" 
event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904440 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904461 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904475 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904487 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904498 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904509 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904519 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904530 4992 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904540 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904555 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904573 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904398 4992 scope.go:117] "RemoveContainer" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904586 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904762 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904774 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904782 4992 pod_container_deletor.go:114] "Failed to issue 
the request to remove container" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904791 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904822 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904832 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904839 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904846 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904861 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904876 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 
09:37:48.904912 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904920 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904927 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904934 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904941 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904949 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904956 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.904984 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 
09:37:48.904994 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905006 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46cdx" event={"ID":"6939ca32-c541-41c0-ba96-4282b942ff16","Type":"ContainerDied","Data":"706a6d5527482a210ea1d90d8ea86c4f87a85094d0cdbc69b975098a24c405d9"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905020 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905090 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905098 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905106 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905113 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905120 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905127 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905133 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905140 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.905203 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.908848 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/2.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.909662 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/1.log" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.909714 4992 generic.go:334] "Generic (PLEG): container finished" podID="6bd42532-8655-4c14-991b-4cc36dea52d5" containerID="6c56799f9d42ab763c18e23603d8d02dbeae5c0bc0167fb521c17f9dd9372a8a" exitCode=2 Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.909743 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" 
event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerDied","Data":"6c56799f9d42ab763c18e23603d8d02dbeae5c0bc0167fb521c17f9dd9372a8a"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.909772 4992 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30"} Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.910310 4992 scope.go:117] "RemoveContainer" containerID="6c56799f9d42ab763c18e23603d8d02dbeae5c0bc0167fb521c17f9dd9372a8a" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.919668 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-69l5h" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.936030 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.936179 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.976461 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-46cdx"] Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.979629 4992 scope.go:117] "RemoveContainer" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" Jan 31 09:37:48 crc kubenswrapper[4992]: I0131 09:37:48.980693 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-46cdx"] Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.006525 4992 scope.go:117] "RemoveContainer" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.020301 4992 scope.go:117] "RemoveContainer" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.032221 4992 scope.go:117] "RemoveContainer" containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.044780 4992 scope.go:117] "RemoveContainer" containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.057410 4992 scope.go:117] "RemoveContainer" containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.073210 4992 scope.go:117] "RemoveContainer" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.136829 4992 scope.go:117] "RemoveContainer" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.150186 4992 scope.go:117] "RemoveContainer" 
containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.150658 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": container with ID starting with b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6 not found: ID does not exist" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.150710 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} err="failed to get container status \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": rpc error: code = NotFound desc = could not find container \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": container with ID starting with b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.150738 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.151037 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": container with ID starting with cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5 not found: ID does not exist" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.151060 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} err="failed to get container status \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": rpc error: code = NotFound desc = could not find container \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": container with ID starting with cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.151077 4992 scope.go:117] "RemoveContainer" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.151363 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": container with ID starting with c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 not found: ID does not exist" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.151456 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} err="failed to get container status \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": rpc error: code = NotFound desc = could not find container \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": container with ID starting with c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.151484 4992 scope.go:117] "RemoveContainer" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.151776 4992 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": container with ID starting with 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d not found: ID does not exist" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.151807 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} err="failed to get container status \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": rpc error: code = NotFound desc = could not find container \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": container with ID starting with 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.151822 4992 scope.go:117] "RemoveContainer" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.152078 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": container with ID starting with 11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1 not found: ID does not exist" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.152098 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} err="failed to get container status \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": rpc error: code = NotFound desc = could not find container 
\"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": container with ID starting with 11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.152114 4992 scope.go:117] "RemoveContainer" containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.152379 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": container with ID starting with 1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109 not found: ID does not exist" containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.152400 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} err="failed to get container status \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": rpc error: code = NotFound desc = could not find container \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": container with ID starting with 1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.152442 4992 scope.go:117] "RemoveContainer" containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.152779 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": container with ID starting with d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0 not found: ID does not exist" 
containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.152808 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} err="failed to get container status \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": rpc error: code = NotFound desc = could not find container \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": container with ID starting with d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.152829 4992 scope.go:117] "RemoveContainer" containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.153156 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": container with ID starting with d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e not found: ID does not exist" containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.153182 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} err="failed to get container status \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": rpc error: code = NotFound desc = could not find container \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": container with ID starting with d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.153206 4992 scope.go:117] 
"RemoveContainer" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.153815 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": container with ID starting with 24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed not found: ID does not exist" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.153840 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} err="failed to get container status \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": rpc error: code = NotFound desc = could not find container \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": container with ID starting with 24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.153857 4992 scope.go:117] "RemoveContainer" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" Jan 31 09:37:49 crc kubenswrapper[4992]: E0131 09:37:49.154065 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": container with ID starting with ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66 not found: ID does not exist" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154087 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} err="failed to get container status \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": rpc error: code = NotFound desc = could not find container \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": container with ID starting with ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154104 4992 scope.go:117] "RemoveContainer" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154301 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} err="failed to get container status \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": rpc error: code = NotFound desc = could not find container \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": container with ID starting with b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154321 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154538 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} err="failed to get container status \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": rpc error: code = NotFound desc = could not find container \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": container with ID starting with cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5 not found: ID does not 
exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154556 4992 scope.go:117] "RemoveContainer" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154743 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} err="failed to get container status \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": rpc error: code = NotFound desc = could not find container \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": container with ID starting with c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154763 4992 scope.go:117] "RemoveContainer" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154952 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} err="failed to get container status \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": rpc error: code = NotFound desc = could not find container \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": container with ID starting with 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.154972 4992 scope.go:117] "RemoveContainer" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155167 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} err="failed to get container status 
\"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": rpc error: code = NotFound desc = could not find container \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": container with ID starting with 11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155187 4992 scope.go:117] "RemoveContainer" containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155352 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} err="failed to get container status \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": rpc error: code = NotFound desc = could not find container \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": container with ID starting with 1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155368 4992 scope.go:117] "RemoveContainer" containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155630 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} err="failed to get container status \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": rpc error: code = NotFound desc = could not find container \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": container with ID starting with d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155650 4992 scope.go:117] "RemoveContainer" 
containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155840 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} err="failed to get container status \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": rpc error: code = NotFound desc = could not find container \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": container with ID starting with d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.155858 4992 scope.go:117] "RemoveContainer" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156043 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} err="failed to get container status \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": rpc error: code = NotFound desc = could not find container \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": container with ID starting with 24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156060 4992 scope.go:117] "RemoveContainer" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156240 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} err="failed to get container status \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": rpc error: code = NotFound desc = could 
not find container \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": container with ID starting with ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156256 4992 scope.go:117] "RemoveContainer" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156464 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} err="failed to get container status \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": rpc error: code = NotFound desc = could not find container \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": container with ID starting with b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156482 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156682 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} err="failed to get container status \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": rpc error: code = NotFound desc = could not find container \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": container with ID starting with cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156700 4992 scope.go:117] "RemoveContainer" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 
09:37:49.156899 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} err="failed to get container status \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": rpc error: code = NotFound desc = could not find container \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": container with ID starting with c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.156917 4992 scope.go:117] "RemoveContainer" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.157091 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} err="failed to get container status \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": rpc error: code = NotFound desc = could not find container \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": container with ID starting with 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.157134 4992 scope.go:117] "RemoveContainer" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.157346 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} err="failed to get container status \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": rpc error: code = NotFound desc = could not find container \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": container with ID starting with 
11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.157371 4992 scope.go:117] "RemoveContainer" containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.157845 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} err="failed to get container status \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": rpc error: code = NotFound desc = could not find container \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": container with ID starting with 1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.157867 4992 scope.go:117] "RemoveContainer" containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158074 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} err="failed to get container status \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": rpc error: code = NotFound desc = could not find container \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": container with ID starting with d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158092 4992 scope.go:117] "RemoveContainer" containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158303 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} err="failed to get container status \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": rpc error: code = NotFound desc = could not find container \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": container with ID starting with d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158325 4992 scope.go:117] "RemoveContainer" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158629 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} err="failed to get container status \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": rpc error: code = NotFound desc = could not find container \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": container with ID starting with 24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158646 4992 scope.go:117] "RemoveContainer" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158836 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} err="failed to get container status \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": rpc error: code = NotFound desc = could not find container \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": container with ID starting with ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66 not found: ID does not 
exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.158859 4992 scope.go:117] "RemoveContainer" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159050 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} err="failed to get container status \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": rpc error: code = NotFound desc = could not find container \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": container with ID starting with b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159072 4992 scope.go:117] "RemoveContainer" containerID="cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159473 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5"} err="failed to get container status \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": rpc error: code = NotFound desc = could not find container \"cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5\": container with ID starting with cc64eba7f25dd6a3c68dd4787f7737aa19fc66a6e612efeb674b2ad042d033a5 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159493 4992 scope.go:117] "RemoveContainer" containerID="c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159703 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511"} err="failed to get container status 
\"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": rpc error: code = NotFound desc = could not find container \"c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511\": container with ID starting with c4ff57901dab2e88fd19101a0c5eaac81a0c3f1a67988913841ae88b2e9ff511 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159730 4992 scope.go:117] "RemoveContainer" containerID="0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159918 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d"} err="failed to get container status \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": rpc error: code = NotFound desc = could not find container \"0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d\": container with ID starting with 0c575cfff7aa59828d58082345e5f10a4502edc1f7e28a295a974c10b908554d not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.159938 4992 scope.go:117] "RemoveContainer" containerID="11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.160167 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1"} err="failed to get container status \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": rpc error: code = NotFound desc = could not find container \"11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1\": container with ID starting with 11bb5cd87fd461747bd3c1ed8a48cb232fde33088e5884f9fd6647c443d170b1 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.160191 4992 scope.go:117] "RemoveContainer" 
containerID="1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.160954 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109"} err="failed to get container status \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": rpc error: code = NotFound desc = could not find container \"1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109\": container with ID starting with 1ce0fbca74791d26d0fbb4bbc2e18bc19814e2d2087ebf1b9fbb152e7cf3c109 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.160973 4992 scope.go:117] "RemoveContainer" containerID="d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.161333 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0"} err="failed to get container status \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": rpc error: code = NotFound desc = could not find container \"d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0\": container with ID starting with d76f240db98db56fe007accbab78d098825caad4ae80a202304bab0f28685eb0 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.161358 4992 scope.go:117] "RemoveContainer" containerID="d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.161606 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e"} err="failed to get container status \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": rpc error: code = NotFound desc = could 
not find container \"d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e\": container with ID starting with d69767684e9bdcf63e9f56245a52989c371a8062a4c1b3dc013a615ce9ee250e not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.161625 4992 scope.go:117] "RemoveContainer" containerID="24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.167666 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed"} err="failed to get container status \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": rpc error: code = NotFound desc = could not find container \"24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed\": container with ID starting with 24335aa6e95c807378e55a310aa861929b0c7df3a928fd5a09c298bb4e85f8ed not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.167741 4992 scope.go:117] "RemoveContainer" containerID="ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.168473 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66"} err="failed to get container status \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": rpc error: code = NotFound desc = could not find container \"ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66\": container with ID starting with ae1f259b396eccdc6561da02fa7f24426072d274e5f123d46a9a67041f069d66 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.168508 4992 scope.go:117] "RemoveContainer" containerID="b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 
09:37:49.168965 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6"} err="failed to get container status \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": rpc error: code = NotFound desc = could not find container \"b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6\": container with ID starting with b114d43597961c655f785151f06cab083202107be88045e93203c6ef10c9d0d6 not found: ID does not exist" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.194265 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6939ca32-c541-41c0-ba96-4282b942ff16" path="/var/lib/kubelet/pods/6939ca32-c541-41c0-ba96-4282b942ff16/volumes" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.916736 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/2.log" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.917525 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/1.log" Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.917623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjplh" event={"ID":"6bd42532-8655-4c14-991b-4cc36dea52d5","Type":"ContainerStarted","Data":"101dc7bdc2d583862ded05d1131ef96f849692fe36b3b796d920f4475e165e4f"} Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.919465 4992 generic.go:334] "Generic (PLEG): container finished" podID="31bc5549-6e44-46c0-9c49-9398e6311979" containerID="e9b6b52db1b79ed527c8bbb55f984ef104b887aead9abddc852fe150868e7c69" exitCode=0 Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.919571 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" 
event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerDied","Data":"e9b6b52db1b79ed527c8bbb55f984ef104b887aead9abddc852fe150868e7c69"} Jan 31 09:37:49 crc kubenswrapper[4992]: I0131 09:37:49.919689 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"48e687240c2850565f925d3da5c99addc910641008a59d9c05b9df85f9a79bf8"} Jan 31 09:37:50 crc kubenswrapper[4992]: I0131 09:37:50.929609 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"3b02f1189e0d5b596c1ea143e40c81f2c1822a3d31b77161467b0b3362912904"} Jan 31 09:37:50 crc kubenswrapper[4992]: I0131 09:37:50.929934 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"02a007396e613d21c048ed12075e36d9d5cec6b8925bf472a2ec104995069201"} Jan 31 09:37:50 crc kubenswrapper[4992]: I0131 09:37:50.929954 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"66d169528fc4b960112d88d39977ff93d7f389ad193dd9250eedb0e4529a638e"} Jan 31 09:37:50 crc kubenswrapper[4992]: I0131 09:37:50.929971 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"35554635b9d9312048a17bbbd24eb68ddfc05e5a14fee11587eaa65fbd459032"} Jan 31 09:37:50 crc kubenswrapper[4992]: I0131 09:37:50.929991 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" 
event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"cfb705f0106418864ac0372599ae1d016d1e446d0898cd585669db0ff6385a14"} Jan 31 09:37:50 crc kubenswrapper[4992]: I0131 09:37:50.930008 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"ab656d7f3fdaaf779b8750caac19550b2a8f53c8de8a59046635a4af6114f3df"} Jan 31 09:37:52 crc kubenswrapper[4992]: I0131 09:37:52.944492 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"e9547fcb67c96da72437a9ed1f8de6cfe51d3fe6df7486fd51cc1facb4be0b11"} Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.964319 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" event={"ID":"31bc5549-6e44-46c0-9c49-9398e6311979","Type":"ContainerStarted","Data":"5310340c70512c023a4b854ce7a961cbb36fd49bd9f98965b303a3daddb6393d"} Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.964943 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.964961 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.964972 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.993361 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" podStartSLOduration=7.993347717 podStartE2EDuration="7.993347717s" podCreationTimestamp="2026-01-31 09:37:48 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:37:55.990991989 +0000 UTC m=+771.962383986" watchObservedRunningTime="2026-01-31 09:37:55.993347717 +0000 UTC m=+771.964739704" Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.997762 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:37:55 crc kubenswrapper[4992]: I0131 09:37:55.998681 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:38:05 crc kubenswrapper[4992]: I0131 09:38:05.512743 4992 scope.go:117] "RemoveContainer" containerID="8093e58dc2c6c71099d24769108b59f4c73d80c97ee5ed5e394699c3ceff3a30" Jan 31 09:38:08 crc kubenswrapper[4992]: I0131 09:38:08.037322 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bjplh_6bd42532-8655-4c14-991b-4cc36dea52d5/kube-multus/2.log" Jan 31 09:38:15 crc kubenswrapper[4992]: I0131 09:38:15.301775 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:38:15 crc kubenswrapper[4992]: I0131 09:38:15.302114 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:38:18 crc kubenswrapper[4992]: I0131 09:38:18.961552 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tqpkq" Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.848736 
4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl"] Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.850522 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.852581 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.863615 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl"] Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.899695 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.899783 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:31 crc kubenswrapper[4992]: I0131 09:38:31.900233 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjx4\" (UniqueName: 
\"kubernetes.io/projected/c6db0abb-11b5-47ac-a974-497ff05312b8-kube-api-access-rbjx4\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.002292 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjx4\" (UniqueName: \"kubernetes.io/projected/c6db0abb-11b5-47ac-a974-497ff05312b8-kube-api-access-rbjx4\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.002687 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.002827 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.003184 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: 
\"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.003277 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.025871 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjx4\" (UniqueName: \"kubernetes.io/projected/c6db0abb-11b5-47ac-a974-497ff05312b8-kube-api-access-rbjx4\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.167496 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:32 crc kubenswrapper[4992]: I0131 09:38:32.384120 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl"] Jan 31 09:38:33 crc kubenswrapper[4992]: I0131 09:38:33.190617 4992 generic.go:334] "Generic (PLEG): container finished" podID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerID="9d01aa37c014cecec8dd79f34cd97ece5d89032b300dee19467018697f4af09e" exitCode=0 Jan 31 09:38:33 crc kubenswrapper[4992]: I0131 09:38:33.191861 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" event={"ID":"c6db0abb-11b5-47ac-a974-497ff05312b8","Type":"ContainerDied","Data":"9d01aa37c014cecec8dd79f34cd97ece5d89032b300dee19467018697f4af09e"} Jan 31 09:38:33 crc kubenswrapper[4992]: I0131 09:38:33.191898 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" event={"ID":"c6db0abb-11b5-47ac-a974-497ff05312b8","Type":"ContainerStarted","Data":"68bcac68ddd882cccfc643f1a24a50c0cdddf72efb8f0b3f23ad767e8ef2c3b5"} Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.059175 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rwlfm"] Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.060694 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.075754 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rwlfm"] Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.137890 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-catalog-content\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.137941 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-utilities\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.137985 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-864hr\" (UniqueName: \"kubernetes.io/projected/6a3bf95b-592e-4d16-996e-e57175b19b28-kube-api-access-864hr\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.239351 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-864hr\" (UniqueName: \"kubernetes.io/projected/6a3bf95b-592e-4d16-996e-e57175b19b28-kube-api-access-864hr\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.239465 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-catalog-content\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.239497 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-utilities\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.239958 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-utilities\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.240045 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-catalog-content\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.261979 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-864hr\" (UniqueName: \"kubernetes.io/projected/6a3bf95b-592e-4d16-996e-e57175b19b28-kube-api-access-864hr\") pod \"redhat-operators-rwlfm\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") " pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.387709 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:34 crc kubenswrapper[4992]: I0131 09:38:34.586829 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rwlfm"] Jan 31 09:38:35 crc kubenswrapper[4992]: I0131 09:38:35.201290 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerID="9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa" exitCode=0 Jan 31 09:38:35 crc kubenswrapper[4992]: I0131 09:38:35.201396 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerDied","Data":"9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa"} Jan 31 09:38:35 crc kubenswrapper[4992]: I0131 09:38:35.201618 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerStarted","Data":"8481f6d0e3630030af23c417f676706961da7f3a47aadddcf76c14492db5d83b"} Jan 31 09:38:37 crc kubenswrapper[4992]: I0131 09:38:37.213039 4992 generic.go:334] "Generic (PLEG): container finished" podID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerID="0cff99eed683d587588379846a1b51b9bdc4b03836d423763f1d5f33ae4821ee" exitCode=0 Jan 31 09:38:37 crc kubenswrapper[4992]: I0131 09:38:37.213090 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" event={"ID":"c6db0abb-11b5-47ac-a974-497ff05312b8","Type":"ContainerDied","Data":"0cff99eed683d587588379846a1b51b9bdc4b03836d423763f1d5f33ae4821ee"} Jan 31 09:38:38 crc kubenswrapper[4992]: I0131 09:38:38.220773 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" 
event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerStarted","Data":"c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb"} Jan 31 09:38:38 crc kubenswrapper[4992]: I0131 09:38:38.224296 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" event={"ID":"c6db0abb-11b5-47ac-a974-497ff05312b8","Type":"ContainerStarted","Data":"0fbc8266726dace13daaeffdf54996b76d77c626fc2b646f6a3a2a669fa416eb"} Jan 31 09:38:38 crc kubenswrapper[4992]: I0131 09:38:38.276127 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" podStartSLOduration=4.236124667 podStartE2EDuration="7.276108311s" podCreationTimestamp="2026-01-31 09:38:31 +0000 UTC" firstStartedPulling="2026-01-31 09:38:33.191976659 +0000 UTC m=+809.163368646" lastFinishedPulling="2026-01-31 09:38:36.231960303 +0000 UTC m=+812.203352290" observedRunningTime="2026-01-31 09:38:38.274315329 +0000 UTC m=+814.245707316" watchObservedRunningTime="2026-01-31 09:38:38.276108311 +0000 UTC m=+814.247500298" Jan 31 09:38:39 crc kubenswrapper[4992]: I0131 09:38:39.230916 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerID="c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb" exitCode=0 Jan 31 09:38:39 crc kubenswrapper[4992]: I0131 09:38:39.230969 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerDied","Data":"c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb"} Jan 31 09:38:39 crc kubenswrapper[4992]: I0131 09:38:39.234094 4992 generic.go:334] "Generic (PLEG): container finished" podID="c6db0abb-11b5-47ac-a974-497ff05312b8" 
containerID="0fbc8266726dace13daaeffdf54996b76d77c626fc2b646f6a3a2a669fa416eb" exitCode=0 Jan 31 09:38:39 crc kubenswrapper[4992]: I0131 09:38:39.234118 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" event={"ID":"c6db0abb-11b5-47ac-a974-497ff05312b8","Type":"ContainerDied","Data":"0fbc8266726dace13daaeffdf54996b76d77c626fc2b646f6a3a2a669fa416eb"} Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.453359 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.551736 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-bundle\") pod \"c6db0abb-11b5-47ac-a974-497ff05312b8\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.552070 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-util\") pod \"c6db0abb-11b5-47ac-a974-497ff05312b8\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.552141 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbjx4\" (UniqueName: \"kubernetes.io/projected/c6db0abb-11b5-47ac-a974-497ff05312b8-kube-api-access-rbjx4\") pod \"c6db0abb-11b5-47ac-a974-497ff05312b8\" (UID: \"c6db0abb-11b5-47ac-a974-497ff05312b8\") " Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.552574 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-bundle" (OuterVolumeSpecName: "bundle") pod 
"c6db0abb-11b5-47ac-a974-497ff05312b8" (UID: "c6db0abb-11b5-47ac-a974-497ff05312b8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.562088 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-util" (OuterVolumeSpecName: "util") pod "c6db0abb-11b5-47ac-a974-497ff05312b8" (UID: "c6db0abb-11b5-47ac-a974-497ff05312b8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.562255 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6db0abb-11b5-47ac-a974-497ff05312b8-kube-api-access-rbjx4" (OuterVolumeSpecName: "kube-api-access-rbjx4") pod "c6db0abb-11b5-47ac-a974-497ff05312b8" (UID: "c6db0abb-11b5-47ac-a974-497ff05312b8"). InnerVolumeSpecName "kube-api-access-rbjx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.653699 4992 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.653743 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbjx4\" (UniqueName: \"kubernetes.io/projected/c6db0abb-11b5-47ac-a974-497ff05312b8-kube-api-access-rbjx4\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:40 crc kubenswrapper[4992]: I0131 09:38:40.653759 4992 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6db0abb-11b5-47ac-a974-497ff05312b8-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:41 crc kubenswrapper[4992]: I0131 09:38:41.246199 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" 
event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerStarted","Data":"c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a"} Jan 31 09:38:41 crc kubenswrapper[4992]: I0131 09:38:41.248724 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" event={"ID":"c6db0abb-11b5-47ac-a974-497ff05312b8","Type":"ContainerDied","Data":"68bcac68ddd882cccfc643f1a24a50c0cdddf72efb8f0b3f23ad767e8ef2c3b5"} Jan 31 09:38:41 crc kubenswrapper[4992]: I0131 09:38:41.248897 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68bcac68ddd882cccfc643f1a24a50c0cdddf72efb8f0b3f23ad767e8ef2c3b5" Jan 31 09:38:41 crc kubenswrapper[4992]: I0131 09:38:41.248828 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl" Jan 31 09:38:41 crc kubenswrapper[4992]: I0131 09:38:41.266190 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rwlfm" podStartSLOduration=2.306402754 podStartE2EDuration="7.266168554s" podCreationTimestamp="2026-01-31 09:38:34 +0000 UTC" firstStartedPulling="2026-01-31 09:38:35.202465804 +0000 UTC m=+811.173857781" lastFinishedPulling="2026-01-31 09:38:40.162231594 +0000 UTC m=+816.133623581" observedRunningTime="2026-01-31 09:38:41.261913502 +0000 UTC m=+817.233305499" watchObservedRunningTime="2026-01-31 09:38:41.266168554 +0000 UTC m=+817.237560541" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.376976 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-62m4h"] Jan 31 09:38:42 crc kubenswrapper[4992]: E0131 09:38:42.377218 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="util" Jan 31 09:38:42 crc 
kubenswrapper[4992]: I0131 09:38:42.377233 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="util" Jan 31 09:38:42 crc kubenswrapper[4992]: E0131 09:38:42.377252 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="pull" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.377261 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="pull" Jan 31 09:38:42 crc kubenswrapper[4992]: E0131 09:38:42.377272 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="extract" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.377280 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="extract" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.377406 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6db0abb-11b5-47ac-a974-497ff05312b8" containerName="extract" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.377878 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.380432 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.380482 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.382566 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gmh4n" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.390262 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-62m4h"] Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.492756 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdg5k\" (UniqueName: \"kubernetes.io/projected/7b9dd266-01bc-4f6c-8f1e-a2e0711081fc-kube-api-access-fdg5k\") pod \"nmstate-operator-646758c888-62m4h\" (UID: \"7b9dd266-01bc-4f6c-8f1e-a2e0711081fc\") " pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.593900 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdg5k\" (UniqueName: \"kubernetes.io/projected/7b9dd266-01bc-4f6c-8f1e-a2e0711081fc-kube-api-access-fdg5k\") pod \"nmstate-operator-646758c888-62m4h\" (UID: \"7b9dd266-01bc-4f6c-8f1e-a2e0711081fc\") " pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.611602 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdg5k\" (UniqueName: \"kubernetes.io/projected/7b9dd266-01bc-4f6c-8f1e-a2e0711081fc-kube-api-access-fdg5k\") pod \"nmstate-operator-646758c888-62m4h\" (UID: 
\"7b9dd266-01bc-4f6c-8f1e-a2e0711081fc\") " pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.691853 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" Jan 31 09:38:42 crc kubenswrapper[4992]: I0131 09:38:42.876660 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-62m4h"] Jan 31 09:38:42 crc kubenswrapper[4992]: W0131 09:38:42.883382 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9dd266_01bc_4f6c_8f1e_a2e0711081fc.slice/crio-35c4a160770ccbb8b42c0a64afb42cf1ed3861038499ae2e267e73ba92a7fc7a WatchSource:0}: Error finding container 35c4a160770ccbb8b42c0a64afb42cf1ed3861038499ae2e267e73ba92a7fc7a: Status 404 returned error can't find the container with id 35c4a160770ccbb8b42c0a64afb42cf1ed3861038499ae2e267e73ba92a7fc7a Jan 31 09:38:43 crc kubenswrapper[4992]: I0131 09:38:43.259085 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" event={"ID":"7b9dd266-01bc-4f6c-8f1e-a2e0711081fc","Type":"ContainerStarted","Data":"35c4a160770ccbb8b42c0a64afb42cf1ed3861038499ae2e267e73ba92a7fc7a"} Jan 31 09:38:44 crc kubenswrapper[4992]: I0131 09:38:44.388129 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:44 crc kubenswrapper[4992]: I0131 09:38:44.388279 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:45 crc kubenswrapper[4992]: I0131 09:38:45.301358 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:38:45 crc kubenswrapper[4992]: I0131 09:38:45.301749 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:38:45 crc kubenswrapper[4992]: I0131 09:38:45.301808 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:38:45 crc kubenswrapper[4992]: I0131 09:38:45.302490 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcfc3dfd610126640e5e892b66ddc4eb9fe55b9da0dd6e87ee131f4f08a55e7c"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:38:45 crc kubenswrapper[4992]: I0131 09:38:45.302541 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://dcfc3dfd610126640e5e892b66ddc4eb9fe55b9da0dd6e87ee131f4f08a55e7c" gracePeriod=600 Jan 31 09:38:45 crc kubenswrapper[4992]: I0131 09:38:45.435006 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rwlfm" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="registry-server" probeResult="failure" output=< Jan 31 09:38:45 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 09:38:45 crc kubenswrapper[4992]: > Jan 31 09:38:46 crc kubenswrapper[4992]: I0131 
09:38:46.285522 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="dcfc3dfd610126640e5e892b66ddc4eb9fe55b9da0dd6e87ee131f4f08a55e7c" exitCode=0
Jan 31 09:38:46 crc kubenswrapper[4992]: I0131 09:38:46.285567 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"dcfc3dfd610126640e5e892b66ddc4eb9fe55b9da0dd6e87ee131f4f08a55e7c"}
Jan 31 09:38:46 crc kubenswrapper[4992]: I0131 09:38:46.285600 4992 scope.go:117] "RemoveContainer" containerID="985e7c05e12ab986daeed0403a8d03b948841da13cf4731f55f5b3f5414e74ac"
Jan 31 09:38:48 crc kubenswrapper[4992]: I0131 09:38:48.301977 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"56fd2e562c473f9f02a32edbe3694b09ca6daec109306548ace480ef8bb463a3"}
Jan 31 09:38:48 crc kubenswrapper[4992]: I0131 09:38:48.303750 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" event={"ID":"7b9dd266-01bc-4f6c-8f1e-a2e0711081fc","Type":"ContainerStarted","Data":"7c0d637e8064cbb6f14e141317efd69f8278a3a474713e850fe0525efdfe0a81"}
Jan 31 09:38:48 crc kubenswrapper[4992]: I0131 09:38:48.348029 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-62m4h" podStartSLOduration=2.324444527 podStartE2EDuration="6.348013254s" podCreationTimestamp="2026-01-31 09:38:42 +0000 UTC" firstStartedPulling="2026-01-31 09:38:42.887487 +0000 UTC m=+818.858878987" lastFinishedPulling="2026-01-31 09:38:46.911055717 +0000 UTC m=+822.882447714" observedRunningTime="2026-01-31 09:38:48.345126431 +0000 UTC m=+824.316518428" watchObservedRunningTime="2026-01-31 09:38:48.348013254 +0000 UTC m=+824.319405231"
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.935879 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-sd59f"]
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.936970 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f"
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.938638 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-h7ws8"
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.946722 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-sd59f"]
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.965167 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-v6rfm"]
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.966613 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.989872 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"]
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.990811 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:51 crc kubenswrapper[4992]: I0131 09:38:51.993558 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.009202 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"]
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.030547 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-nmstate-lock\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.030635 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg9lw\" (UniqueName: \"kubernetes.io/projected/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-kube-api-access-xg9lw\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.030712 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-dbus-socket\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.030743 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-ovs-socket\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.030916 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn6rj\" (UniqueName: \"kubernetes.io/projected/9e2f9f80-b7a6-4a51-b481-723b3b0daad7-kube-api-access-xn6rj\") pod \"nmstate-metrics-54757c584b-sd59f\" (UID: \"9e2f9f80-b7a6-4a51-b481-723b3b0daad7\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.076188 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"]
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.077105 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.091884 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.091986 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.092084 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-fzxqq"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.093848 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"]
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133131 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133230 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg9lw\" (UniqueName: \"kubernetes.io/projected/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-kube-api-access-xg9lw\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133309 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-dbus-socket\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133334 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtnq2\" (UniqueName: \"kubernetes.io/projected/aa44611e-4b2f-4d88-bc6f-04146843bbae-kube-api-access-qtnq2\") pod \"nmstate-webhook-8474b5b9d8-fhb87\" (UID: \"aa44611e-4b2f-4d88-bc6f-04146843bbae\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133362 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-ovs-socket\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133382 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133410 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42k2r\" (UniqueName: \"kubernetes.io/projected/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-kube-api-access-42k2r\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133446 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn6rj\" (UniqueName: \"kubernetes.io/projected/9e2f9f80-b7a6-4a51-b481-723b3b0daad7-kube-api-access-xn6rj\") pod \"nmstate-metrics-54757c584b-sd59f\" (UID: \"9e2f9f80-b7a6-4a51-b481-723b3b0daad7\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133479 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa44611e-4b2f-4d88-bc6f-04146843bbae-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-fhb87\" (UID: \"aa44611e-4b2f-4d88-bc6f-04146843bbae\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133499 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-nmstate-lock\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133591 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-nmstate-lock\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133608 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-dbus-socket\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.133657 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-ovs-socket\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.158579 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg9lw\" (UniqueName: \"kubernetes.io/projected/90afa0e0-1dc5-441a-a8f9-5f26b53ebe34-kube-api-access-xg9lw\") pod \"nmstate-handler-v6rfm\" (UID: \"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34\") " pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.165791 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn6rj\" (UniqueName: \"kubernetes.io/projected/9e2f9f80-b7a6-4a51-b481-723b3b0daad7-kube-api-access-xn6rj\") pod \"nmstate-metrics-54757c584b-sd59f\" (UID: \"9e2f9f80-b7a6-4a51-b481-723b3b0daad7\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.234445 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtnq2\" (UniqueName: \"kubernetes.io/projected/aa44611e-4b2f-4d88-bc6f-04146843bbae-kube-api-access-qtnq2\") pod \"nmstate-webhook-8474b5b9d8-fhb87\" (UID: \"aa44611e-4b2f-4d88-bc6f-04146843bbae\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.234512 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.234555 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42k2r\" (UniqueName: \"kubernetes.io/projected/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-kube-api-access-42k2r\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.234596 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa44611e-4b2f-4d88-bc6f-04146843bbae-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-fhb87\" (UID: \"aa44611e-4b2f-4d88-bc6f-04146843bbae\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.234639 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: E0131 09:38:52.234789 4992 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Jan 31 09:38:52 crc kubenswrapper[4992]: E0131 09:38:52.234850 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-plugin-serving-cert podName:ae4cff7e-7c67-4840-8b78-ca21eb4e1abf nodeName:}" failed. No retries permitted until 2026-01-31 09:38:52.734828627 +0000 UTC m=+828.706220614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-d4xvw" (UID: "ae4cff7e-7c67-4840-8b78-ca21eb4e1abf") : secret "plugin-serving-cert" not found
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.235978 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.243484 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aa44611e-4b2f-4d88-bc6f-04146843bbae-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-fhb87\" (UID: \"aa44611e-4b2f-4d88-bc6f-04146843bbae\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.252748 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.255254 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42k2r\" (UniqueName: \"kubernetes.io/projected/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-kube-api-access-42k2r\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.259026 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtnq2\" (UniqueName: \"kubernetes.io/projected/aa44611e-4b2f-4d88-bc6f-04146843bbae-kube-api-access-qtnq2\") pod \"nmstate-webhook-8474b5b9d8-fhb87\" (UID: \"aa44611e-4b2f-4d88-bc6f-04146843bbae\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.280654 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.305932 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-54465874f9-hw7pd"]
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.311276 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.316366 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.343151 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54465874f9-hw7pd"]
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.438370 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-oauth-config\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.439173 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-serving-cert\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.439194 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbx9r\" (UniqueName: \"kubernetes.io/projected/89128d77-7d06-4ca9-8df2-e5156bc0fec9-kube-api-access-mbx9r\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.439225 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-oauth-serving-cert\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.439282 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-trusted-ca-bundle\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.439304 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-config\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.439326 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-service-ca\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.517160 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-sd59f"]
Jan 31 09:38:52 crc kubenswrapper[4992]: W0131 09:38:52.522903 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e2f9f80_b7a6_4a51_b481_723b3b0daad7.slice/crio-f8058341862988f89d0b6b0a06d1234dee6ffc28bd3610ad015ab4ed358a4ee3 WatchSource:0}: Error finding container f8058341862988f89d0b6b0a06d1234dee6ffc28bd3610ad015ab4ed358a4ee3: Status 404 returned error can't find the container with id f8058341862988f89d0b6b0a06d1234dee6ffc28bd3610ad015ab4ed358a4ee3
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540184 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-oauth-config\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540236 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-serving-cert\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540253 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbx9r\" (UniqueName: \"kubernetes.io/projected/89128d77-7d06-4ca9-8df2-e5156bc0fec9-kube-api-access-mbx9r\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540282 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-oauth-serving-cert\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540309 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-trusted-ca-bundle\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540330 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-config\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.540355 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-service-ca\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.541454 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-service-ca\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.542034 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-config\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.542063 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-oauth-serving-cert\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.542318 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89128d77-7d06-4ca9-8df2-e5156bc0fec9-trusted-ca-bundle\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.546205 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-oauth-config\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.557214 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/89128d77-7d06-4ca9-8df2-e5156bc0fec9-console-serving-cert\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.558341 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbx9r\" (UniqueName: \"kubernetes.io/projected/89128d77-7d06-4ca9-8df2-e5156bc0fec9-kube-api-access-mbx9r\") pod \"console-54465874f9-hw7pd\" (UID: \"89128d77-7d06-4ca9-8df2-e5156bc0fec9\") " pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.578011 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"]
Jan 31 09:38:52 crc kubenswrapper[4992]: W0131 09:38:52.580683 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa44611e_4b2f_4d88_bc6f_04146843bbae.slice/crio-4456db72c58e8ec5b5f0761f2e86a883b4ebec6cea8384abe970a2b3ed0853cb WatchSource:0}: Error finding container 4456db72c58e8ec5b5f0761f2e86a883b4ebec6cea8384abe970a2b3ed0853cb: Status 404 returned error can't find the container with id 4456db72c58e8ec5b5f0761f2e86a883b4ebec6cea8384abe970a2b3ed0853cb
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.660269 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54465874f9-hw7pd"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.743070 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.746480 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4cff7e-7c67-4840-8b78-ca21eb4e1abf-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-d4xvw\" (UID: \"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:52 crc kubenswrapper[4992]: I0131 09:38:52.877806 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54465874f9-hw7pd"]
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.020727 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.212836 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw"]
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.352772 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87" event={"ID":"aa44611e-4b2f-4d88-bc6f-04146843bbae","Type":"ContainerStarted","Data":"4456db72c58e8ec5b5f0761f2e86a883b4ebec6cea8384abe970a2b3ed0853cb"}
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.353906 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f" event={"ID":"9e2f9f80-b7a6-4a51-b481-723b3b0daad7","Type":"ContainerStarted","Data":"f8058341862988f89d0b6b0a06d1234dee6ffc28bd3610ad015ab4ed358a4ee3"}
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.355859 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54465874f9-hw7pd" event={"ID":"89128d77-7d06-4ca9-8df2-e5156bc0fec9","Type":"ContainerStarted","Data":"6873d9fb492bf762a4ec1bb9fd3d3c19370881f9247e3db5299d3d79175349d0"}
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.355903 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54465874f9-hw7pd" event={"ID":"89128d77-7d06-4ca9-8df2-e5156bc0fec9","Type":"ContainerStarted","Data":"e70dc60e6884124784e3b8ac10d598dfa03f473e2929e0a531e2f8d3f6a7a212"}
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.357579 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw" event={"ID":"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf","Type":"ContainerStarted","Data":"a1585cb0c2ef8c80908c0b897999705bfdd8684a8e00c91c7a5f97f38df417d1"}
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.358506 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v6rfm" event={"ID":"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34","Type":"ContainerStarted","Data":"acaeb39b7d1e561c11225d2a2e81f2ed1057e9c63d2e0fb73af00285d0d6677a"}
Jan 31 09:38:53 crc kubenswrapper[4992]: I0131 09:38:53.377284 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54465874f9-hw7pd" podStartSLOduration=1.377263991 podStartE2EDuration="1.377263991s" podCreationTimestamp="2026-01-31 09:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:38:53.373987266 +0000 UTC m=+829.345379253" watchObservedRunningTime="2026-01-31 09:38:53.377263991 +0000 UTC m=+829.348655988"
Jan 31 09:38:54 crc kubenswrapper[4992]: I0131 09:38:54.429043 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rwlfm"
Jan 31 09:38:54 crc kubenswrapper[4992]: I0131 09:38:54.478857 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rwlfm"
Jan 31 09:38:54 crc kubenswrapper[4992]: I0131 09:38:54.657673 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rwlfm"]
Jan 31 09:38:55 crc kubenswrapper[4992]: I0131 09:38:55.372456 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87" event={"ID":"aa44611e-4b2f-4d88-bc6f-04146843bbae","Type":"ContainerStarted","Data":"d4ded2a2ec743177225ced541c2d986ec2a89e3e3e5b84f5ef85c6d1c5385b91"}
Jan 31 09:38:55 crc kubenswrapper[4992]: I0131 09:38:55.372791 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87"
Jan 31 09:38:55 crc kubenswrapper[4992]: I0131 09:38:55.377078 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f" event={"ID":"9e2f9f80-b7a6-4a51-b481-723b3b0daad7","Type":"ContainerStarted","Data":"4c03692cca4d80b3e70f4d7ecdc7fd5473a96efb567b3c9389bbf1a87b2f291c"}
Jan 31 09:38:55 crc kubenswrapper[4992]: I0131 09:38:55.389582 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87" podStartSLOduration=1.8486536999999998 podStartE2EDuration="4.389564349s" podCreationTimestamp="2026-01-31 09:38:51 +0000 UTC" firstStartedPulling="2026-01-31 09:38:52.583115855 +0000 UTC m=+828.554507842" lastFinishedPulling="2026-01-31 09:38:55.124026494 +0000 UTC m=+831.095418491" observedRunningTime="2026-01-31 09:38:55.388626852 +0000 UTC m=+831.360018839" watchObservedRunningTime="2026-01-31 09:38:55.389564349 +0000 UTC m=+831.360956336"
Jan 31 09:38:56 crc kubenswrapper[4992]: I0131 09:38:56.384165 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-v6rfm" event={"ID":"90afa0e0-1dc5-441a-a8f9-5f26b53ebe34","Type":"ContainerStarted","Data":"59fb2d74ab8d7c273fc779b08037af717800cb48cfe2b86393c4d7292fe39e50"}
Jan 31 09:38:56 crc kubenswrapper[4992]: I0131 09:38:56.384318 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rwlfm" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="registry-server" containerID="cri-o://c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a" gracePeriod=2
Jan 31 09:38:56 crc kubenswrapper[4992]: I0131 09:38:56.385353 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-v6rfm"
Jan 31 09:38:56 crc kubenswrapper[4992]: I0131 09:38:56.415940 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-v6rfm" podStartSLOduration=2.622998113 podStartE2EDuration="5.415919217s" podCreationTimestamp="2026-01-31 09:38:51 +0000 UTC" firstStartedPulling="2026-01-31 09:38:52.355709281 +0000 UTC m=+828.327101268" lastFinishedPulling="2026-01-31 09:38:55.148630375 +0000 UTC m=+831.120022372" observedRunningTime="2026-01-31 09:38:56.414093224 +0000 UTC m=+832.385485211" watchObservedRunningTime="2026-01-31 09:38:56.415919217 +0000 UTC m=+832.387311214"
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.185876 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rwlfm"
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.214933 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-864hr\" (UniqueName: \"kubernetes.io/projected/6a3bf95b-592e-4d16-996e-e57175b19b28-kube-api-access-864hr\") pod \"6a3bf95b-592e-4d16-996e-e57175b19b28\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") "
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.215608 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-catalog-content\") pod \"6a3bf95b-592e-4d16-996e-e57175b19b28\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") "
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.215722 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-utilities\") pod \"6a3bf95b-592e-4d16-996e-e57175b19b28\" (UID: \"6a3bf95b-592e-4d16-996e-e57175b19b28\") "
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.216547 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-utilities" (OuterVolumeSpecName: "utilities") pod "6a3bf95b-592e-4d16-996e-e57175b19b28" (UID: "6a3bf95b-592e-4d16-996e-e57175b19b28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.221657 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3bf95b-592e-4d16-996e-e57175b19b28-kube-api-access-864hr" (OuterVolumeSpecName: "kube-api-access-864hr") pod "6a3bf95b-592e-4d16-996e-e57175b19b28" (UID: "6a3bf95b-592e-4d16-996e-e57175b19b28"). InnerVolumeSpecName "kube-api-access-864hr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.317642 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.317674 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-864hr\" (UniqueName: \"kubernetes.io/projected/6a3bf95b-592e-4d16-996e-e57175b19b28-kube-api-access-864hr\") on node \"crc\" DevicePath \"\""
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.379124 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a3bf95b-592e-4d16-996e-e57175b19b28" (UID: "6a3bf95b-592e-4d16-996e-e57175b19b28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.395105 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerID="c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a" exitCode=0
Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.395730 4992 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-rwlfm" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.396583 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerDied","Data":"c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a"} Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.396628 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rwlfm" event={"ID":"6a3bf95b-592e-4d16-996e-e57175b19b28","Type":"ContainerDied","Data":"8481f6d0e3630030af23c417f676706961da7f3a47aadddcf76c14492db5d83b"} Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.396648 4992 scope.go:117] "RemoveContainer" containerID="c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.420994 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3bf95b-592e-4d16-996e-e57175b19b28-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.428749 4992 scope.go:117] "RemoveContainer" containerID="c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.450680 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rwlfm"] Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.457943 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rwlfm"] Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.464228 4992 scope.go:117] "RemoveContainer" containerID="9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.479027 4992 scope.go:117] "RemoveContainer" 
containerID="c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a" Jan 31 09:38:57 crc kubenswrapper[4992]: E0131 09:38:57.479868 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a\": container with ID starting with c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a not found: ID does not exist" containerID="c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.479921 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a"} err="failed to get container status \"c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a\": rpc error: code = NotFound desc = could not find container \"c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a\": container with ID starting with c12b5eda50a6f026dd2a2cc91b13dc1e0f56331382649bb5211a190680398c5a not found: ID does not exist" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.479954 4992 scope.go:117] "RemoveContainer" containerID="c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb" Jan 31 09:38:57 crc kubenswrapper[4992]: E0131 09:38:57.480247 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb\": container with ID starting with c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb not found: ID does not exist" containerID="c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.480358 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb"} err="failed to get container status \"c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb\": rpc error: code = NotFound desc = could not find container \"c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb\": container with ID starting with c3cea06038dcb02b85843df57a523cafc85b86082b0f653b2e0fedbcc5758adb not found: ID does not exist" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.480550 4992 scope.go:117] "RemoveContainer" containerID="9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa" Jan 31 09:38:57 crc kubenswrapper[4992]: E0131 09:38:57.480931 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa\": container with ID starting with 9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa not found: ID does not exist" containerID="9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa" Jan 31 09:38:57 crc kubenswrapper[4992]: I0131 09:38:57.480983 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa"} err="failed to get container status \"9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa\": rpc error: code = NotFound desc = could not find container \"9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa\": container with ID starting with 9ddb6574b80800af3efd41363c4d8f7d332d49e77a4a9b3548c29f1bc02da4aa not found: ID does not exist" Jan 31 09:38:58 crc kubenswrapper[4992]: I0131 09:38:58.414764 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw" 
event={"ID":"ae4cff7e-7c67-4840-8b78-ca21eb4e1abf","Type":"ContainerStarted","Data":"9780d6698a25a29b1f7e06eb0ea116b10c25c7e44d98806c03992b0cc4419120"} Jan 31 09:38:58 crc kubenswrapper[4992]: I0131 09:38:58.438596 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-d4xvw" podStartSLOduration=1.838815423 podStartE2EDuration="6.438570345s" podCreationTimestamp="2026-01-31 09:38:52 +0000 UTC" firstStartedPulling="2026-01-31 09:38:53.227715378 +0000 UTC m=+829.199107365" lastFinishedPulling="2026-01-31 09:38:57.8274703 +0000 UTC m=+833.798862287" observedRunningTime="2026-01-31 09:38:58.431057187 +0000 UTC m=+834.402449194" watchObservedRunningTime="2026-01-31 09:38:58.438570345 +0000 UTC m=+834.409962342" Jan 31 09:38:59 crc kubenswrapper[4992]: I0131 09:38:59.201792 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" path="/var/lib/kubelet/pods/6a3bf95b-592e-4d16-996e-e57175b19b28/volumes" Jan 31 09:38:59 crc kubenswrapper[4992]: I0131 09:38:59.428028 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f" event={"ID":"9e2f9f80-b7a6-4a51-b481-723b3b0daad7","Type":"ContainerStarted","Data":"e1894807b823caea6a04ab79f6a5526712f0766adbfe35a20a5dd4baedbefeaa"} Jan 31 09:38:59 crc kubenswrapper[4992]: I0131 09:38:59.451030 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-sd59f" podStartSLOduration=1.968919517 podStartE2EDuration="8.45101292s" podCreationTimestamp="2026-01-31 09:38:51 +0000 UTC" firstStartedPulling="2026-01-31 09:38:52.524992905 +0000 UTC m=+828.496384892" lastFinishedPulling="2026-01-31 09:38:59.007086308 +0000 UTC m=+834.978478295" observedRunningTime="2026-01-31 09:38:59.44962895 +0000 UTC m=+835.421020967" watchObservedRunningTime="2026-01-31 09:38:59.45101292 +0000 UTC m=+835.422404897" 
Jan 31 09:39:02 crc kubenswrapper[4992]: I0131 09:39:02.307809 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-v6rfm" Jan 31 09:39:02 crc kubenswrapper[4992]: I0131 09:39:02.660812 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54465874f9-hw7pd" Jan 31 09:39:02 crc kubenswrapper[4992]: I0131 09:39:02.660888 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-54465874f9-hw7pd" Jan 31 09:39:02 crc kubenswrapper[4992]: I0131 09:39:02.665792 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54465874f9-hw7pd" Jan 31 09:39:03 crc kubenswrapper[4992]: I0131 09:39:03.451754 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54465874f9-hw7pd" Jan 31 09:39:03 crc kubenswrapper[4992]: I0131 09:39:03.524883 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7bjlw"] Jan 31 09:39:12 crc kubenswrapper[4992]: I0131 09:39:12.322924 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-fhb87" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.410874 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p"] Jan 31 09:39:25 crc kubenswrapper[4992]: E0131 09:39:25.411713 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="extract-content" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.411730 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="extract-content" Jan 31 09:39:25 crc kubenswrapper[4992]: E0131 09:39:25.411752 4992 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="extract-utilities" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.411760 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="extract-utilities" Jan 31 09:39:25 crc kubenswrapper[4992]: E0131 09:39:25.411770 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="registry-server" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.411778 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="registry-server" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.411917 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3bf95b-592e-4d16-996e-e57175b19b28" containerName="registry-server" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.412846 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.415368 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.422522 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p"] Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.531821 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cngz\" (UniqueName: \"kubernetes.io/projected/9f78434a-da65-45e0-ae70-54b461c9e408-kube-api-access-6cngz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.531912 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.532152 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: 
I0131 09:39:25.633860 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.633948 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.633975 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cngz\" (UniqueName: \"kubernetes.io/projected/9f78434a-da65-45e0-ae70-54b461c9e408-kube-api-access-6cngz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.634713 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.635642 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.653035 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cngz\" (UniqueName: \"kubernetes.io/projected/9f78434a-da65-45e0-ae70-54b461c9e408-kube-api-access-6cngz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:25 crc kubenswrapper[4992]: I0131 09:39:25.733379 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:26 crc kubenswrapper[4992]: I0131 09:39:26.128018 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p"] Jan 31 09:39:26 crc kubenswrapper[4992]: W0131 09:39:26.137931 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f78434a_da65_45e0_ae70_54b461c9e408.slice/crio-c1858a96e04742d7cc79e97a2c6b4cdbd79ed5f1dd755560b314dc83b8c92a87 WatchSource:0}: Error finding container c1858a96e04742d7cc79e97a2c6b4cdbd79ed5f1dd755560b314dc83b8c92a87: Status 404 returned error can't find the container with id c1858a96e04742d7cc79e97a2c6b4cdbd79ed5f1dd755560b314dc83b8c92a87 Jan 31 09:39:26 crc kubenswrapper[4992]: I0131 09:39:26.573536 4992 generic.go:334] "Generic (PLEG): container finished" podID="9f78434a-da65-45e0-ae70-54b461c9e408" containerID="a1490e3052ef57a80415062062054cb99c14402ebef7b8575dbdce40c8c136c3" exitCode=0 
Jan 31 09:39:26 crc kubenswrapper[4992]: I0131 09:39:26.573838 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" event={"ID":"9f78434a-da65-45e0-ae70-54b461c9e408","Type":"ContainerDied","Data":"a1490e3052ef57a80415062062054cb99c14402ebef7b8575dbdce40c8c136c3"} Jan 31 09:39:26 crc kubenswrapper[4992]: I0131 09:39:26.573868 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" event={"ID":"9f78434a-da65-45e0-ae70-54b461c9e408","Type":"ContainerStarted","Data":"c1858a96e04742d7cc79e97a2c6b4cdbd79ed5f1dd755560b314dc83b8c92a87"} Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.563571 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7bjlw" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" containerID="cri-o://7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae" gracePeriod=15 Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.629557 4992 generic.go:334] "Generic (PLEG): container finished" podID="9f78434a-da65-45e0-ae70-54b461c9e408" containerID="1b4be17396a013528c642eb48ac3eb243bd1a5b3a1d4f5736531d37f85e3f5e9" exitCode=0 Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.629611 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" event={"ID":"9f78434a-da65-45e0-ae70-54b461c9e408","Type":"ContainerDied","Data":"1b4be17396a013528c642eb48ac3eb243bd1a5b3a1d4f5736531d37f85e3f5e9"} Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.888000 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7bjlw_ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6/console/0.log" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.888239 4992 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982480 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-oauth-serving-cert\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982548 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-service-ca\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982636 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-config\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982662 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-serving-cert\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982694 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-trusted-ca-bundle\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982716 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cd4qw\" (UniqueName: \"kubernetes.io/projected/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-kube-api-access-cd4qw\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.982742 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-oauth-config\") pod \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\" (UID: \"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6\") " Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.983357 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-config" (OuterVolumeSpecName: "console-config") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.983428 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.983682 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.983974 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-service-ca" (OuterVolumeSpecName: "service-ca") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.988861 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.989745 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:39:28 crc kubenswrapper[4992]: I0131 09:39:28.990112 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-kube-api-access-cd4qw" (OuterVolumeSpecName: "kube-api-access-cd4qw") pod "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" (UID: "ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6"). InnerVolumeSpecName "kube-api-access-cd4qw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084008 4992 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084051 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd4qw\" (UniqueName: \"kubernetes.io/projected/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-kube-api-access-cd4qw\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084066 4992 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084077 4992 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084088 4992 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084098 4992 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.084111 4992 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:29 crc 
kubenswrapper[4992]: I0131 09:39:29.635218 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7bjlw_ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6/console/0.log" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.635279 4992 generic.go:334] "Generic (PLEG): container finished" podID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerID="7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae" exitCode=2 Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.635336 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7bjlw" event={"ID":"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6","Type":"ContainerDied","Data":"7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae"} Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.635348 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7bjlw" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.635371 4992 scope.go:117] "RemoveContainer" containerID="7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.635360 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7bjlw" event={"ID":"ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6","Type":"ContainerDied","Data":"39c2143b93c2c591d93b4302609f1c2408c48d383e898a9a16f4b1c67731da1b"} Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.637645 4992 generic.go:334] "Generic (PLEG): container finished" podID="9f78434a-da65-45e0-ae70-54b461c9e408" containerID="e0fb2e2b3fc928dee7742b8074683fa82081602f20f055ac5c62ea1852c4148d" exitCode=0 Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.637701 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" 
event={"ID":"9f78434a-da65-45e0-ae70-54b461c9e408","Type":"ContainerDied","Data":"e0fb2e2b3fc928dee7742b8074683fa82081602f20f055ac5c62ea1852c4148d"} Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.650631 4992 scope.go:117] "RemoveContainer" containerID="7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae" Jan 31 09:39:29 crc kubenswrapper[4992]: E0131 09:39:29.651074 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae\": container with ID starting with 7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae not found: ID does not exist" containerID="7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.651112 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae"} err="failed to get container status \"7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae\": rpc error: code = NotFound desc = could not find container \"7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae\": container with ID starting with 7a6a719b97d93ae745ef222ca7c1cafccb483e3f059bc60fc1cd386ff87cc5ae not found: ID does not exist" Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.655130 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7bjlw"] Jan 31 09:39:29 crc kubenswrapper[4992]: I0131 09:39:29.659233 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7bjlw"] Jan 31 09:39:30 crc kubenswrapper[4992]: I0131 09:39:30.919786 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.007545 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cngz\" (UniqueName: \"kubernetes.io/projected/9f78434a-da65-45e0-ae70-54b461c9e408-kube-api-access-6cngz\") pod \"9f78434a-da65-45e0-ae70-54b461c9e408\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.007662 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-util\") pod \"9f78434a-da65-45e0-ae70-54b461c9e408\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.007719 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-bundle\") pod \"9f78434a-da65-45e0-ae70-54b461c9e408\" (UID: \"9f78434a-da65-45e0-ae70-54b461c9e408\") " Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.008733 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-bundle" (OuterVolumeSpecName: "bundle") pod "9f78434a-da65-45e0-ae70-54b461c9e408" (UID: "9f78434a-da65-45e0-ae70-54b461c9e408"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.018279 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f78434a-da65-45e0-ae70-54b461c9e408-kube-api-access-6cngz" (OuterVolumeSpecName: "kube-api-access-6cngz") pod "9f78434a-da65-45e0-ae70-54b461c9e408" (UID: "9f78434a-da65-45e0-ae70-54b461c9e408"). InnerVolumeSpecName "kube-api-access-6cngz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.022625 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-util" (OuterVolumeSpecName: "util") pod "9f78434a-da65-45e0-ae70-54b461c9e408" (UID: "9f78434a-da65-45e0-ae70-54b461c9e408"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.109622 4992 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.109669 4992 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f78434a-da65-45e0-ae70-54b461c9e408-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.109684 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cngz\" (UniqueName: \"kubernetes.io/projected/9f78434a-da65-45e0-ae70-54b461c9e408-kube-api-access-6cngz\") on node \"crc\" DevicePath \"\"" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.189134 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" path="/var/lib/kubelet/pods/ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6/volumes" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.654667 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" event={"ID":"9f78434a-da65-45e0-ae70-54b461c9e408","Type":"ContainerDied","Data":"c1858a96e04742d7cc79e97a2c6b4cdbd79ed5f1dd755560b314dc83b8c92a87"} Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.654705 4992 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c1858a96e04742d7cc79e97a2c6b4cdbd79ed5f1dd755560b314dc83b8c92a87" Jan 31 09:39:31 crc kubenswrapper[4992]: I0131 09:39:31.654778 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.076276 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr"] Jan 31 09:39:41 crc kubenswrapper[4992]: E0131 09:39:41.076872 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" containerName="util" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.076883 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" containerName="util" Jan 31 09:39:41 crc kubenswrapper[4992]: E0131 09:39:41.076893 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" containerName="pull" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.076900 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" containerName="pull" Jan 31 09:39:41 crc kubenswrapper[4992]: E0131 09:39:41.076906 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.076912 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" Jan 31 09:39:41 crc kubenswrapper[4992]: E0131 09:39:41.076922 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" containerName="extract" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.076927 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" 
containerName="extract" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.077014 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f78434a-da65-45e0-ae70-54b461c9e408" containerName="extract" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.077023 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3d6869-d3b5-4ccf-9d67-4ab765f3a7b6" containerName="console" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.077393 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.081579 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.082326 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rj6gx" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.082340 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.083299 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.083763 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.096112 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr"] Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.137610 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkz4j\" (UniqueName: 
\"kubernetes.io/projected/06319871-3a38-41ed-966a-4fb2fc393b6e-kube-api-access-gkz4j\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.137683 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06319871-3a38-41ed-966a-4fb2fc393b6e-webhook-cert\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.137720 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06319871-3a38-41ed-966a-4fb2fc393b6e-apiservice-cert\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.238438 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06319871-3a38-41ed-966a-4fb2fc393b6e-webhook-cert\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.238488 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06319871-3a38-41ed-966a-4fb2fc393b6e-apiservice-cert\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " 
pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.238547 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkz4j\" (UniqueName: \"kubernetes.io/projected/06319871-3a38-41ed-966a-4fb2fc393b6e-kube-api-access-gkz4j\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.246157 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06319871-3a38-41ed-966a-4fb2fc393b6e-webhook-cert\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.253071 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06319871-3a38-41ed-966a-4fb2fc393b6e-apiservice-cert\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.265594 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkz4j\" (UniqueName: \"kubernetes.io/projected/06319871-3a38-41ed-966a-4fb2fc393b6e-kube-api-access-gkz4j\") pod \"metallb-operator-controller-manager-7f5cfb8dfd-ks4dr\" (UID: \"06319871-3a38-41ed-966a-4fb2fc393b6e\") " pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.341349 4992 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f"] Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.342127 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.344812 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5rlpw" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.344886 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.344970 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.355490 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f"] Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.394832 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.441796 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41cfef2d-5a07-47e2-88d5-62a2f468029e-webhook-cert\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.441859 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41cfef2d-5a07-47e2-88d5-62a2f468029e-apiservice-cert\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.441937 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xnt\" (UniqueName: \"kubernetes.io/projected/41cfef2d-5a07-47e2-88d5-62a2f468029e-kube-api-access-s5xnt\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.543722 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41cfef2d-5a07-47e2-88d5-62a2f468029e-webhook-cert\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.543948 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41cfef2d-5a07-47e2-88d5-62a2f468029e-apiservice-cert\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.544008 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xnt\" (UniqueName: \"kubernetes.io/projected/41cfef2d-5a07-47e2-88d5-62a2f468029e-kube-api-access-s5xnt\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.558696 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41cfef2d-5a07-47e2-88d5-62a2f468029e-apiservice-cert\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.559188 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41cfef2d-5a07-47e2-88d5-62a2f468029e-webhook-cert\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") " pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.574105 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xnt\" (UniqueName: \"kubernetes.io/projected/41cfef2d-5a07-47e2-88d5-62a2f468029e-kube-api-access-s5xnt\") pod \"metallb-operator-webhook-server-6ff564f44c-br64f\" (UID: \"41cfef2d-5a07-47e2-88d5-62a2f468029e\") 
" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.661651 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.661922 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr"] Jan 31 09:39:41 crc kubenswrapper[4992]: I0131 09:39:41.881845 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f"] Jan 31 09:39:41 crc kubenswrapper[4992]: W0131 09:39:41.888380 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41cfef2d_5a07_47e2_88d5_62a2f468029e.slice/crio-7755d9aeacda9fa61b2c08c32046ca3dd2336202c231d33e1c6f68e9ce62db31 WatchSource:0}: Error finding container 7755d9aeacda9fa61b2c08c32046ca3dd2336202c231d33e1c6f68e9ce62db31: Status 404 returned error can't find the container with id 7755d9aeacda9fa61b2c08c32046ca3dd2336202c231d33e1c6f68e9ce62db31 Jan 31 09:39:42 crc kubenswrapper[4992]: I0131 09:39:42.717666 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" event={"ID":"06319871-3a38-41ed-966a-4fb2fc393b6e","Type":"ContainerStarted","Data":"dd6184939952a59af7378d3291faee4b4c8ffe24e1c2a9594f0c49e5d2c1cd33"} Jan 31 09:39:42 crc kubenswrapper[4992]: I0131 09:39:42.718724 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" event={"ID":"41cfef2d-5a07-47e2-88d5-62a2f468029e","Type":"ContainerStarted","Data":"7755d9aeacda9fa61b2c08c32046ca3dd2336202c231d33e1c6f68e9ce62db31"} Jan 31 09:39:49 crc kubenswrapper[4992]: I0131 09:39:49.761290 4992 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" event={"ID":"41cfef2d-5a07-47e2-88d5-62a2f468029e","Type":"ContainerStarted","Data":"55bac500f8929d5ff66ada7bd0df67b819c8a456cc1afe3c12132fd13330fbc6"} Jan 31 09:39:49 crc kubenswrapper[4992]: I0131 09:39:49.761828 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:39:49 crc kubenswrapper[4992]: I0131 09:39:49.765214 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" event={"ID":"06319871-3a38-41ed-966a-4fb2fc393b6e","Type":"ContainerStarted","Data":"2a33fc3daf54282b3d3888d60b8121043c94b34815bdd341b0a1a8cd11b73ca2"} Jan 31 09:39:49 crc kubenswrapper[4992]: I0131 09:39:49.765362 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:39:49 crc kubenswrapper[4992]: I0131 09:39:49.777948 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" podStartSLOduration=1.643237584 podStartE2EDuration="8.777929692s" podCreationTimestamp="2026-01-31 09:39:41 +0000 UTC" firstStartedPulling="2026-01-31 09:39:41.891271931 +0000 UTC m=+877.862663918" lastFinishedPulling="2026-01-31 09:39:49.025964039 +0000 UTC m=+884.997356026" observedRunningTime="2026-01-31 09:39:49.77682576 +0000 UTC m=+885.748217767" watchObservedRunningTime="2026-01-31 09:39:49.777929692 +0000 UTC m=+885.749321699" Jan 31 09:39:49 crc kubenswrapper[4992]: I0131 09:39:49.802973 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" podStartSLOduration=1.56458016 podStartE2EDuration="8.802947385s" podCreationTimestamp="2026-01-31 09:39:41 +0000 UTC" firstStartedPulling="2026-01-31 
09:39:41.769604664 +0000 UTC m=+877.740996661" lastFinishedPulling="2026-01-31 09:39:49.007971899 +0000 UTC m=+884.979363886" observedRunningTime="2026-01-31 09:39:49.799506506 +0000 UTC m=+885.770898543" watchObservedRunningTime="2026-01-31 09:39:49.802947385 +0000 UTC m=+885.774339382" Jan 31 09:40:01 crc kubenswrapper[4992]: I0131 09:40:01.669879 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6ff564f44c-br64f" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.040406 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zc724"] Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.044174 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.052760 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zc724"] Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.162740 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhzvh\" (UniqueName: \"kubernetes.io/projected/14d5da0b-0b92-4836-9bd8-d04761d1a160-kube-api-access-hhzvh\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.162787 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-catalog-content\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.162830 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-utilities\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.264014 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhzvh\" (UniqueName: \"kubernetes.io/projected/14d5da0b-0b92-4836-9bd8-d04761d1a160-kube-api-access-hhzvh\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.264075 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-catalog-content\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.264138 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-utilities\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.264745 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-catalog-content\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.264826 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-utilities\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.282134 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhzvh\" (UniqueName: \"kubernetes.io/projected/14d5da0b-0b92-4836-9bd8-d04761d1a160-kube-api-access-hhzvh\") pod \"certified-operators-zc724\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.361258 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:05 crc kubenswrapper[4992]: I0131 09:40:05.901038 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zc724"] Jan 31 09:40:06 crc kubenswrapper[4992]: I0131 09:40:06.858458 4992 generic.go:334] "Generic (PLEG): container finished" podID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerID="aa79c3884ff511b143dbebf923cedaa2fc46e949d5bc8d98dc735fb51656a9a1" exitCode=0 Jan 31 09:40:06 crc kubenswrapper[4992]: I0131 09:40:06.858528 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerDied","Data":"aa79c3884ff511b143dbebf923cedaa2fc46e949d5bc8d98dc735fb51656a9a1"} Jan 31 09:40:06 crc kubenswrapper[4992]: I0131 09:40:06.858580 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerStarted","Data":"09b9cbd8aece7d0f62f4d8ab5d50521863e28f099cabcbffcbc42fd90c2d3a80"} Jan 31 09:40:07 crc kubenswrapper[4992]: I0131 
09:40:07.867670 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerStarted","Data":"3e2f7045c5320c245ea0ff0037e08a566757fdd4c0f5a36d4329e99514c9ea16"} Jan 31 09:40:08 crc kubenswrapper[4992]: I0131 09:40:08.878041 4992 generic.go:334] "Generic (PLEG): container finished" podID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerID="3e2f7045c5320c245ea0ff0037e08a566757fdd4c0f5a36d4329e99514c9ea16" exitCode=0 Jan 31 09:40:08 crc kubenswrapper[4992]: I0131 09:40:08.878163 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerDied","Data":"3e2f7045c5320c245ea0ff0037e08a566757fdd4c0f5a36d4329e99514c9ea16"} Jan 31 09:40:09 crc kubenswrapper[4992]: I0131 09:40:09.888111 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerStarted","Data":"d3a2431544f2053032d08e43980fcef035205708de15bd002faab0d32e3fc241"} Jan 31 09:40:09 crc kubenswrapper[4992]: I0131 09:40:09.913527 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zc724" podStartSLOduration=2.516833579 podStartE2EDuration="4.913513031s" podCreationTimestamp="2026-01-31 09:40:05 +0000 UTC" firstStartedPulling="2026-01-31 09:40:06.861281401 +0000 UTC m=+902.832673388" lastFinishedPulling="2026-01-31 09:40:09.257960853 +0000 UTC m=+905.229352840" observedRunningTime="2026-01-31 09:40:09.912971945 +0000 UTC m=+905.884363962" watchObservedRunningTime="2026-01-31 09:40:09.913513031 +0000 UTC m=+905.884905018" Jan 31 09:40:15 crc kubenswrapper[4992]: I0131 09:40:15.363346 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:15 crc kubenswrapper[4992]: I0131 09:40:15.364053 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:15 crc kubenswrapper[4992]: I0131 09:40:15.403709 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:15 crc kubenswrapper[4992]: I0131 09:40:15.981810 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:16 crc kubenswrapper[4992]: I0131 09:40:16.029236 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zc724"] Jan 31 09:40:17 crc kubenswrapper[4992]: I0131 09:40:17.941983 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zc724" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="registry-server" containerID="cri-o://d3a2431544f2053032d08e43980fcef035205708de15bd002faab0d32e3fc241" gracePeriod=2 Jan 31 09:40:18 crc kubenswrapper[4992]: I0131 09:40:18.949995 4992 generic.go:334] "Generic (PLEG): container finished" podID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerID="d3a2431544f2053032d08e43980fcef035205708de15bd002faab0d32e3fc241" exitCode=0 Jan 31 09:40:18 crc kubenswrapper[4992]: I0131 09:40:18.950026 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerDied","Data":"d3a2431544f2053032d08e43980fcef035205708de15bd002faab0d32e3fc241"} Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.403678 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.466591 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-catalog-content\") pod \"14d5da0b-0b92-4836-9bd8-d04761d1a160\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.466716 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhzvh\" (UniqueName: \"kubernetes.io/projected/14d5da0b-0b92-4836-9bd8-d04761d1a160-kube-api-access-hhzvh\") pod \"14d5da0b-0b92-4836-9bd8-d04761d1a160\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.466752 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-utilities\") pod \"14d5da0b-0b92-4836-9bd8-d04761d1a160\" (UID: \"14d5da0b-0b92-4836-9bd8-d04761d1a160\") " Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.467702 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-utilities" (OuterVolumeSpecName: "utilities") pod "14d5da0b-0b92-4836-9bd8-d04761d1a160" (UID: "14d5da0b-0b92-4836-9bd8-d04761d1a160"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.504911 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d5da0b-0b92-4836-9bd8-d04761d1a160-kube-api-access-hhzvh" (OuterVolumeSpecName: "kube-api-access-hhzvh") pod "14d5da0b-0b92-4836-9bd8-d04761d1a160" (UID: "14d5da0b-0b92-4836-9bd8-d04761d1a160"). InnerVolumeSpecName "kube-api-access-hhzvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.568626 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhzvh\" (UniqueName: \"kubernetes.io/projected/14d5da0b-0b92-4836-9bd8-d04761d1a160-kube-api-access-hhzvh\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.568657 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.780170 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14d5da0b-0b92-4836-9bd8-d04761d1a160" (UID: "14d5da0b-0b92-4836-9bd8-d04761d1a160"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.871447 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14d5da0b-0b92-4836-9bd8-d04761d1a160-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.960169 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zc724" event={"ID":"14d5da0b-0b92-4836-9bd8-d04761d1a160","Type":"ContainerDied","Data":"09b9cbd8aece7d0f62f4d8ab5d50521863e28f099cabcbffcbc42fd90c2d3a80"} Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.960262 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zc724" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.960282 4992 scope.go:117] "RemoveContainer" containerID="d3a2431544f2053032d08e43980fcef035205708de15bd002faab0d32e3fc241" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.980938 4992 scope.go:117] "RemoveContainer" containerID="3e2f7045c5320c245ea0ff0037e08a566757fdd4c0f5a36d4329e99514c9ea16" Jan 31 09:40:19 crc kubenswrapper[4992]: I0131 09:40:19.999830 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zc724"] Jan 31 09:40:20 crc kubenswrapper[4992]: I0131 09:40:20.004265 4992 scope.go:117] "RemoveContainer" containerID="aa79c3884ff511b143dbebf923cedaa2fc46e949d5bc8d98dc735fb51656a9a1" Jan 31 09:40:20 crc kubenswrapper[4992]: I0131 09:40:20.005174 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zc724"] Jan 31 09:40:21 crc kubenswrapper[4992]: I0131 09:40:21.192382 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" path="/var/lib/kubelet/pods/14d5da0b-0b92-4836-9bd8-d04761d1a160/volumes" Jan 31 09:40:21 crc kubenswrapper[4992]: I0131 09:40:21.397028 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7f5cfb8dfd-ks4dr" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.113978 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-khsdm"] Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.114406 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="extract-content" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.114496 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="extract-content" Jan 31 09:40:22 crc 
kubenswrapper[4992]: E0131 09:40:22.114549 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="extract-utilities" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.114606 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="extract-utilities" Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.114666 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="registry-server" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.114721 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="registry-server" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.114867 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d5da0b-0b92-4836-9bd8-d04761d1a160" containerName="registry-server" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.116679 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.119638 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.119703 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.120716 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-j2zh9" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.149213 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh"] Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.150017 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.152124 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.162940 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh"] Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206194 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206308 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-reloader\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206361 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-sockets\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206386 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics-certs\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 
09:40:22.206484 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-conf\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206507 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9704b2d7-bfa6-40c7-a00f-bb4022274a73-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206530 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-startup\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206552 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w87rj\" (UniqueName: \"kubernetes.io/projected/9704b2d7-bfa6-40c7-a00f-bb4022274a73-kube-api-access-w87rj\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.206576 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lx6v\" (UniqueName: \"kubernetes.io/projected/57b24089-856b-4e9c-bdf8-9f0277de0ae4-kube-api-access-6lx6v\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: 
I0131 09:40:22.267164 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7m6xt"] Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.276644 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.281057 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qdsc6" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.281339 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.281340 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.282943 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.303959 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-f4hpb"] Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.304980 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.308862 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.308921 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-reloader\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.308941 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-sockets\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.308961 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics-certs\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.308989 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-conf\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.309025 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/9704b2d7-bfa6-40c7-a00f-bb4022274a73-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.309047 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-startup\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.309066 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w87rj\" (UniqueName: \"kubernetes.io/projected/9704b2d7-bfa6-40c7-a00f-bb4022274a73-kube-api-access-w87rj\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.309086 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lx6v\" (UniqueName: \"kubernetes.io/projected/57b24089-856b-4e9c-bdf8-9f0277de0ae4-kube-api-access-6lx6v\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.309621 4992 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.309697 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics-certs podName:57b24089-856b-4e9c-bdf8-9f0277de0ae4 nodeName:}" failed. No retries permitted until 2026-01-31 09:40:22.809671363 +0000 UTC m=+918.781063431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics-certs") pod "frr-k8s-khsdm" (UID: "57b24089-856b-4e9c-bdf8-9f0277de0ae4") : secret "frr-k8s-certs-secret" not found Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.310199 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-reloader\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.310440 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-sockets\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.310470 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.310561 4992 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.310609 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9704b2d7-bfa6-40c7-a00f-bb4022274a73-cert podName:9704b2d7-bfa6-40c7-a00f-bb4022274a73 nodeName:}" failed. No retries permitted until 2026-01-31 09:40:22.81059208 +0000 UTC m=+918.781984127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9704b2d7-bfa6-40c7-a00f-bb4022274a73-cert") pod "frr-k8s-webhook-server-7df86c4f6c-bb8hh" (UID: "9704b2d7-bfa6-40c7-a00f-bb4022274a73") : secret "frr-k8s-webhook-server-cert" not found Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.310759 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-conf\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.311973 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.312460 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/57b24089-856b-4e9c-bdf8-9f0277de0ae4-frr-startup\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.334990 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-f4hpb"] Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.343714 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w87rj\" (UniqueName: \"kubernetes.io/projected/9704b2d7-bfa6-40c7-a00f-bb4022274a73-kube-api-access-w87rj\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.355808 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lx6v\" (UniqueName: \"kubernetes.io/projected/57b24089-856b-4e9c-bdf8-9f0277de0ae4-kube-api-access-6lx6v\") pod 
\"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410508 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c71871-7487-4d31-8967-2032f4048162-cert\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410802 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-metrics-certs\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410845 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410886 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfnr2\" (UniqueName: \"kubernetes.io/projected/fa9e7b4e-012f-446c-b156-cae8b53ba319-kube-api-access-mfnr2\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410916 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c71871-7487-4d31-8967-2032f4048162-metrics-certs\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " 
pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410941 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlq97\" (UniqueName: \"kubernetes.io/projected/a5c71871-7487-4d31-8967-2032f4048162-kube-api-access-rlq97\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.410962 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fa9e7b4e-012f-446c-b156-cae8b53ba319-metallb-excludel2\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.512154 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.512260 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfnr2\" (UniqueName: \"kubernetes.io/projected/fa9e7b4e-012f-446c-b156-cae8b53ba319-kube-api-access-mfnr2\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.512340 4992 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.512434 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist 
podName:fa9e7b4e-012f-446c-b156-cae8b53ba319 nodeName:}" failed. No retries permitted until 2026-01-31 09:40:23.012400463 +0000 UTC m=+918.983792450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist") pod "speaker-7m6xt" (UID: "fa9e7b4e-012f-446c-b156-cae8b53ba319") : secret "metallb-memberlist" not found Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.512292 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c71871-7487-4d31-8967-2032f4048162-metrics-certs\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.512618 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlq97\" (UniqueName: \"kubernetes.io/projected/a5c71871-7487-4d31-8967-2032f4048162-kube-api-access-rlq97\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.512635 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fa9e7b4e-012f-446c-b156-cae8b53ba319-metallb-excludel2\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.512694 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c71871-7487-4d31-8967-2032f4048162-cert\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: 
I0131 09:40:22.512727 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-metrics-certs\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.512871 4992 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 31 09:40:22 crc kubenswrapper[4992]: E0131 09:40:22.512923 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-metrics-certs podName:fa9e7b4e-012f-446c-b156-cae8b53ba319 nodeName:}" failed. No retries permitted until 2026-01-31 09:40:23.012907368 +0000 UTC m=+918.984299355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-metrics-certs") pod "speaker-7m6xt" (UID: "fa9e7b4e-012f-446c-b156-cae8b53ba319") : secret "speaker-certs-secret" not found Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.513512 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fa9e7b4e-012f-446c-b156-cae8b53ba319-metallb-excludel2\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.514253 4992 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.515938 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5c71871-7487-4d31-8967-2032f4048162-metrics-certs\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " 
pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.533863 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfnr2\" (UniqueName: \"kubernetes.io/projected/fa9e7b4e-012f-446c-b156-cae8b53ba319-kube-api-access-mfnr2\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.534159 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlq97\" (UniqueName: \"kubernetes.io/projected/a5c71871-7487-4d31-8967-2032f4048162-kube-api-access-rlq97\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.534345 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5c71871-7487-4d31-8967-2032f4048162-cert\") pod \"controller-6968d8fdc4-f4hpb\" (UID: \"a5c71871-7487-4d31-8967-2032f4048162\") " pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.629405 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.816806 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics-certs\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.816877 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9704b2d7-bfa6-40c7-a00f-bb4022274a73-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.822071 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57b24089-856b-4e9c-bdf8-9f0277de0ae4-metrics-certs\") pod \"frr-k8s-khsdm\" (UID: \"57b24089-856b-4e9c-bdf8-9f0277de0ae4\") " pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.823084 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9704b2d7-bfa6-40c7-a00f-bb4022274a73-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-bb8hh\" (UID: \"9704b2d7-bfa6-40c7-a00f-bb4022274a73\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:22 crc kubenswrapper[4992]: I0131 09:40:22.862890 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-f4hpb"] Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.017043 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-f4hpb" 
event={"ID":"a5c71871-7487-4d31-8967-2032f4048162","Type":"ContainerStarted","Data":"81db0fda0321f500d3a3ef15d537aff88feab1fd5b7a0acb51076536d6838da0"} Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.019835 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-metrics-certs\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.019896 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:23 crc kubenswrapper[4992]: E0131 09:40:23.020045 4992 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 09:40:23 crc kubenswrapper[4992]: E0131 09:40:23.020108 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist podName:fa9e7b4e-012f-446c-b156-cae8b53ba319 nodeName:}" failed. No retries permitted until 2026-01-31 09:40:24.020092717 +0000 UTC m=+919.991484704 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist") pod "speaker-7m6xt" (UID: "fa9e7b4e-012f-446c-b156-cae8b53ba319") : secret "metallb-memberlist" not found Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.023624 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-metrics-certs\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.032558 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.089888 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:23 crc kubenswrapper[4992]: I0131 09:40:23.285258 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh"] Jan 31 09:40:23 crc kubenswrapper[4992]: W0131 09:40:23.295987 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9704b2d7_bfa6_40c7_a00f_bb4022274a73.slice/crio-219509b10b1ca96b027c8313312fd0eea22e19124f94aeabe39eef4e8d91608b WatchSource:0}: Error finding container 219509b10b1ca96b027c8313312fd0eea22e19124f94aeabe39eef4e8d91608b: Status 404 returned error can't find the container with id 219509b10b1ca96b027c8313312fd0eea22e19124f94aeabe39eef4e8d91608b Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.023730 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" 
event={"ID":"9704b2d7-bfa6-40c7-a00f-bb4022274a73","Type":"ContainerStarted","Data":"219509b10b1ca96b027c8313312fd0eea22e19124f94aeabe39eef4e8d91608b"} Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.025529 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-f4hpb" event={"ID":"a5c71871-7487-4d31-8967-2032f4048162","Type":"ContainerStarted","Data":"ef04aaca090b478c02643d9a64b30811766f3c21f860e0e63c176bc76bf977ea"} Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.025566 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-f4hpb" event={"ID":"a5c71871-7487-4d31-8967-2032f4048162","Type":"ContainerStarted","Data":"2b5be9daffaf008a35867260a3fc4df33d4e6d23d7526a10f1d547dfaec436a4"} Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.025667 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.026735 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"71960323771b85ca263e37470a862419f336685464ca31635b81a8cd5c021ca4"} Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.032892 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.037969 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fa9e7b4e-012f-446c-b156-cae8b53ba319-memberlist\") pod \"speaker-7m6xt\" (UID: \"fa9e7b4e-012f-446c-b156-cae8b53ba319\") " pod="metallb-system/speaker-7m6xt" Jan 31 09:40:24 crc 
kubenswrapper[4992]: I0131 09:40:24.062837 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-f4hpb" podStartSLOduration=2.062820646 podStartE2EDuration="2.062820646s" podCreationTimestamp="2026-01-31 09:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:40:24.05776107 +0000 UTC m=+920.029153067" watchObservedRunningTime="2026-01-31 09:40:24.062820646 +0000 UTC m=+920.034212633" Jan 31 09:40:24 crc kubenswrapper[4992]: I0131 09:40:24.107099 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7m6xt" Jan 31 09:40:25 crc kubenswrapper[4992]: I0131 09:40:25.036701 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7m6xt" event={"ID":"fa9e7b4e-012f-446c-b156-cae8b53ba319","Type":"ContainerStarted","Data":"5ffc8dffc4bef28a83e237ec224a6ebe185cec88ff3b76d35baa815d13f3d19c"} Jan 31 09:40:25 crc kubenswrapper[4992]: I0131 09:40:25.037035 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7m6xt" event={"ID":"fa9e7b4e-012f-446c-b156-cae8b53ba319","Type":"ContainerStarted","Data":"deccf1ae378ebe071689be869c4776704c59f4c743df29b25a7b822795a82f67"} Jan 31 09:40:25 crc kubenswrapper[4992]: I0131 09:40:25.037049 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7m6xt" event={"ID":"fa9e7b4e-012f-446c-b156-cae8b53ba319","Type":"ContainerStarted","Data":"3ae79d808421b26664f2bbbca369d11efc7121b1ca3adb32903470c614808313"} Jan 31 09:40:25 crc kubenswrapper[4992]: I0131 09:40:25.037225 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7m6xt" Jan 31 09:40:25 crc kubenswrapper[4992]: I0131 09:40:25.058702 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7m6xt" 
podStartSLOduration=3.05867817 podStartE2EDuration="3.05867817s" podCreationTimestamp="2026-01-31 09:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:40:25.0541776 +0000 UTC m=+921.025569607" watchObservedRunningTime="2026-01-31 09:40:25.05867817 +0000 UTC m=+921.030070167" Jan 31 09:40:31 crc kubenswrapper[4992]: I0131 09:40:31.073462 4992 generic.go:334] "Generic (PLEG): container finished" podID="57b24089-856b-4e9c-bdf8-9f0277de0ae4" containerID="04690c8d955f17964499ec5b1c3e4a8ac962701680c95c15dc62080594e814b5" exitCode=0 Jan 31 09:40:31 crc kubenswrapper[4992]: I0131 09:40:31.073649 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerDied","Data":"04690c8d955f17964499ec5b1c3e4a8ac962701680c95c15dc62080594e814b5"} Jan 31 09:40:31 crc kubenswrapper[4992]: I0131 09:40:31.075740 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" event={"ID":"9704b2d7-bfa6-40c7-a00f-bb4022274a73","Type":"ContainerStarted","Data":"388185f18ecde47f2679c11270bab3f2c2683aa87db722aedc8862a6dc351332"} Jan 31 09:40:31 crc kubenswrapper[4992]: I0131 09:40:31.075915 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:31 crc kubenswrapper[4992]: I0131 09:40:31.113891 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" podStartSLOduration=1.91247392 podStartE2EDuration="9.113868916s" podCreationTimestamp="2026-01-31 09:40:22 +0000 UTC" firstStartedPulling="2026-01-31 09:40:23.301949234 +0000 UTC m=+919.273341221" lastFinishedPulling="2026-01-31 09:40:30.50334423 +0000 UTC m=+926.474736217" observedRunningTime="2026-01-31 09:40:31.10674094 +0000 UTC 
m=+927.078132967" watchObservedRunningTime="2026-01-31 09:40:31.113868916 +0000 UTC m=+927.085260923" Jan 31 09:40:32 crc kubenswrapper[4992]: I0131 09:40:32.084006 4992 generic.go:334] "Generic (PLEG): container finished" podID="57b24089-856b-4e9c-bdf8-9f0277de0ae4" containerID="e225de549bdf439cc50d02d1ff2a1dbdab71b5bdf53d29940b12e1ff8d5259d8" exitCode=0 Jan 31 09:40:32 crc kubenswrapper[4992]: I0131 09:40:32.084531 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerDied","Data":"e225de549bdf439cc50d02d1ff2a1dbdab71b5bdf53d29940b12e1ff8d5259d8"} Jan 31 09:40:33 crc kubenswrapper[4992]: I0131 09:40:33.093377 4992 generic.go:334] "Generic (PLEG): container finished" podID="57b24089-856b-4e9c-bdf8-9f0277de0ae4" containerID="3c42b772c6e7a05663fbfb69771520b2c4d84c7834e94c3a7ad9ddd97a4926ad" exitCode=0 Jan 31 09:40:33 crc kubenswrapper[4992]: I0131 09:40:33.093462 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerDied","Data":"3c42b772c6e7a05663fbfb69771520b2c4d84c7834e94c3a7ad9ddd97a4926ad"} Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.092063 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79cnm"] Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.093721 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.102350 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79cnm"] Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.111162 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7m6xt" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.166546 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"79f064f588b174268395f71f7b7c1d448922aeafd75060f5853f4fdf0fe9d4ad"} Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.166801 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"717cc270c41a550cfe3e701a00acd6dcbdd95eb63210ea7b88f2d07a3a7e487e"} Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.166872 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"be6e26314c6856702f56fd58d09a894f347b693c091a90664e6e869f0e1e405e"} Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.166930 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"e5c299b399f84b5285ac640f488646746bacdfff73d5c2e6b537bf4b073d268a"} Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.167094 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"28d49de0417b1eaef2431df3579d7708fd5e4b26dde3b57706e118e0555fba30"} Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 
09:40:34.179643 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-catalog-content\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.180755 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-utilities\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.180820 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpx25\" (UniqueName: \"kubernetes.io/projected/bdcfaec8-5faa-4311-8218-274d7d587693-kube-api-access-fpx25\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.282510 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-utilities\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.282582 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpx25\" (UniqueName: \"kubernetes.io/projected/bdcfaec8-5faa-4311-8218-274d7d587693-kube-api-access-fpx25\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 
09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.282686 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-catalog-content\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.283368 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-utilities\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.283399 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-catalog-content\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.300714 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpx25\" (UniqueName: \"kubernetes.io/projected/bdcfaec8-5faa-4311-8218-274d7d587693-kube-api-access-fpx25\") pod \"redhat-marketplace-79cnm\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.415002 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:34 crc kubenswrapper[4992]: I0131 09:40:34.713576 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79cnm"] Jan 31 09:40:34 crc kubenswrapper[4992]: W0131 09:40:34.725888 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdcfaec8_5faa_4311_8218_274d7d587693.slice/crio-b366681c6c13ee18c9920d18ffde090fd68d04dfef71f540cd46c8b6bc265599 WatchSource:0}: Error finding container b366681c6c13ee18c9920d18ffde090fd68d04dfef71f540cd46c8b6bc265599: Status 404 returned error can't find the container with id b366681c6c13ee18c9920d18ffde090fd68d04dfef71f540cd46c8b6bc265599 Jan 31 09:40:35 crc kubenswrapper[4992]: I0131 09:40:35.174893 4992 generic.go:334] "Generic (PLEG): container finished" podID="bdcfaec8-5faa-4311-8218-274d7d587693" containerID="16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0" exitCode=0 Jan 31 09:40:35 crc kubenswrapper[4992]: I0131 09:40:35.174954 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79cnm" event={"ID":"bdcfaec8-5faa-4311-8218-274d7d587693","Type":"ContainerDied","Data":"16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0"} Jan 31 09:40:35 crc kubenswrapper[4992]: I0131 09:40:35.175012 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79cnm" event={"ID":"bdcfaec8-5faa-4311-8218-274d7d587693","Type":"ContainerStarted","Data":"b366681c6c13ee18c9920d18ffde090fd68d04dfef71f540cd46c8b6bc265599"} Jan 31 09:40:35 crc kubenswrapper[4992]: I0131 09:40:35.181854 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-khsdm" event={"ID":"57b24089-856b-4e9c-bdf8-9f0277de0ae4","Type":"ContainerStarted","Data":"e0a032d18349f7f6558f35ad7bee92dfaf35e910731ff6ed1bac1d58bd5bff2d"} Jan 31 
09:40:35 crc kubenswrapper[4992]: I0131 09:40:35.182201 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:35 crc kubenswrapper[4992]: I0131 09:40:35.223837 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-khsdm" podStartSLOduration=5.976026501 podStartE2EDuration="13.223818618s" podCreationTimestamp="2026-01-31 09:40:22 +0000 UTC" firstStartedPulling="2026-01-31 09:40:23.237701577 +0000 UTC m=+919.209093564" lastFinishedPulling="2026-01-31 09:40:30.485493694 +0000 UTC m=+926.456885681" observedRunningTime="2026-01-31 09:40:35.22285777 +0000 UTC m=+931.194249777" watchObservedRunningTime="2026-01-31 09:40:35.223818618 +0000 UTC m=+931.195210605" Jan 31 09:40:36 crc kubenswrapper[4992]: I0131 09:40:36.197192 4992 generic.go:334] "Generic (PLEG): container finished" podID="bdcfaec8-5faa-4311-8218-274d7d587693" containerID="8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737" exitCode=0 Jan 31 09:40:36 crc kubenswrapper[4992]: I0131 09:40:36.197236 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79cnm" event={"ID":"bdcfaec8-5faa-4311-8218-274d7d587693","Type":"ContainerDied","Data":"8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737"} Jan 31 09:40:36 crc kubenswrapper[4992]: I0131 09:40:36.891717 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ncgdf"] Jan 31 09:40:36 crc kubenswrapper[4992]: I0131 09:40:36.892830 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:36 crc kubenswrapper[4992]: I0131 09:40:36.969681 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncgdf"] Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.014954 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-utilities\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.015017 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89sc4\" (UniqueName: \"kubernetes.io/projected/78244768-db4f-451c-8e41-54332e43bc39-kube-api-access-89sc4\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.015079 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-catalog-content\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.116734 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-utilities\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.116793 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-89sc4\" (UniqueName: \"kubernetes.io/projected/78244768-db4f-451c-8e41-54332e43bc39-kube-api-access-89sc4\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.116867 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-catalog-content\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.117633 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-utilities\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.117683 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-catalog-content\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.138241 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89sc4\" (UniqueName: \"kubernetes.io/projected/78244768-db4f-451c-8e41-54332e43bc39-kube-api-access-89sc4\") pod \"community-operators-ncgdf\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.205022 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-79cnm" event={"ID":"bdcfaec8-5faa-4311-8218-274d7d587693","Type":"ContainerStarted","Data":"fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369"} Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.208866 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.697616 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79cnm" podStartSLOduration=2.2995020999999998 podStartE2EDuration="3.697597789s" podCreationTimestamp="2026-01-31 09:40:34 +0000 UTC" firstStartedPulling="2026-01-31 09:40:35.177110348 +0000 UTC m=+931.148502335" lastFinishedPulling="2026-01-31 09:40:36.575206037 +0000 UTC m=+932.546598024" observedRunningTime="2026-01-31 09:40:37.222724683 +0000 UTC m=+933.194116690" watchObservedRunningTime="2026-01-31 09:40:37.697597789 +0000 UTC m=+933.668989776" Jan 31 09:40:37 crc kubenswrapper[4992]: I0131 09:40:37.703410 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncgdf"] Jan 31 09:40:38 crc kubenswrapper[4992]: I0131 09:40:38.033154 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:38 crc kubenswrapper[4992]: I0131 09:40:38.092903 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:38 crc kubenswrapper[4992]: I0131 09:40:38.212003 4992 generic.go:334] "Generic (PLEG): container finished" podID="78244768-db4f-451c-8e41-54332e43bc39" containerID="ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd" exitCode=0 Jan 31 09:40:38 crc kubenswrapper[4992]: I0131 09:40:38.212078 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncgdf" 
event={"ID":"78244768-db4f-451c-8e41-54332e43bc39","Type":"ContainerDied","Data":"ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd"} Jan 31 09:40:38 crc kubenswrapper[4992]: I0131 09:40:38.212112 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncgdf" event={"ID":"78244768-db4f-451c-8e41-54332e43bc39","Type":"ContainerStarted","Data":"bc375985076219752f3724fa3621b080088abc47f87f9e3d49a38d15cc8b8631"} Jan 31 09:40:40 crc kubenswrapper[4992]: I0131 09:40:40.231988 4992 generic.go:334] "Generic (PLEG): container finished" podID="78244768-db4f-451c-8e41-54332e43bc39" containerID="59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d" exitCode=0 Jan 31 09:40:40 crc kubenswrapper[4992]: I0131 09:40:40.232089 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncgdf" event={"ID":"78244768-db4f-451c-8e41-54332e43bc39","Type":"ContainerDied","Data":"59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d"} Jan 31 09:40:41 crc kubenswrapper[4992]: I0131 09:40:41.240929 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncgdf" event={"ID":"78244768-db4f-451c-8e41-54332e43bc39","Type":"ContainerStarted","Data":"b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6"} Jan 31 09:40:41 crc kubenswrapper[4992]: I0131 09:40:41.268353 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ncgdf" podStartSLOduration=2.619209567 podStartE2EDuration="5.268327785s" podCreationTimestamp="2026-01-31 09:40:36 +0000 UTC" firstStartedPulling="2026-01-31 09:40:38.213457289 +0000 UTC m=+934.184849276" lastFinishedPulling="2026-01-31 09:40:40.862575507 +0000 UTC m=+936.833967494" observedRunningTime="2026-01-31 09:40:41.261144058 +0000 UTC m=+937.232536045" watchObservedRunningTime="2026-01-31 09:40:41.268327785 +0000 UTC 
m=+937.239719792" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.634171 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-f4hpb" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.678572 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jntmw"] Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.679292 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.683164 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.684436 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.684517 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gvjk5" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.689154 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jntmw"] Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.799585 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkj69\" (UniqueName: \"kubernetes.io/projected/980f4141-932f-43e9-9519-d1371656816e-kube-api-access-hkj69\") pod \"openstack-operator-index-jntmw\" (UID: \"980f4141-932f-43e9-9519-d1371656816e\") " pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.901222 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkj69\" (UniqueName: 
\"kubernetes.io/projected/980f4141-932f-43e9-9519-d1371656816e-kube-api-access-hkj69\") pod \"openstack-operator-index-jntmw\" (UID: \"980f4141-932f-43e9-9519-d1371656816e\") " pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.924856 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkj69\" (UniqueName: \"kubernetes.io/projected/980f4141-932f-43e9-9519-d1371656816e-kube-api-access-hkj69\") pod \"openstack-operator-index-jntmw\" (UID: \"980f4141-932f-43e9-9519-d1371656816e\") " pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:42 crc kubenswrapper[4992]: I0131 09:40:42.995836 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:43 crc kubenswrapper[4992]: I0131 09:40:43.054883 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-khsdm" Jan 31 09:40:43 crc kubenswrapper[4992]: I0131 09:40:43.109342 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-bb8hh" Jan 31 09:40:43 crc kubenswrapper[4992]: I0131 09:40:43.427697 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jntmw"] Jan 31 09:40:44 crc kubenswrapper[4992]: I0131 09:40:44.259904 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jntmw" event={"ID":"980f4141-932f-43e9-9519-d1371656816e","Type":"ContainerStarted","Data":"c677addc661116529f70aebde614414335b705f8aa4e8e21d3bdb28ec7d81103"} Jan 31 09:40:44 crc kubenswrapper[4992]: I0131 09:40:44.415195 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:44 crc kubenswrapper[4992]: I0131 09:40:44.415262 4992 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:44 crc kubenswrapper[4992]: I0131 09:40:44.476110 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:45 crc kubenswrapper[4992]: I0131 09:40:45.331921 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:47 crc kubenswrapper[4992]: I0131 09:40:47.209727 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:47 crc kubenswrapper[4992]: I0131 09:40:47.210467 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:47 crc kubenswrapper[4992]: I0131 09:40:47.247190 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:47 crc kubenswrapper[4992]: I0131 09:40:47.268295 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79cnm"] Jan 31 09:40:47 crc kubenswrapper[4992]: I0131 09:40:47.287274 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79cnm" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="registry-server" containerID="cri-o://fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369" gracePeriod=2 Jan 31 09:40:47 crc kubenswrapper[4992]: I0131 09:40:47.327873 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.171967 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.275668 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpx25\" (UniqueName: \"kubernetes.io/projected/bdcfaec8-5faa-4311-8218-274d7d587693-kube-api-access-fpx25\") pod \"bdcfaec8-5faa-4311-8218-274d7d587693\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.275759 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-utilities\") pod \"bdcfaec8-5faa-4311-8218-274d7d587693\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.275813 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-catalog-content\") pod \"bdcfaec8-5faa-4311-8218-274d7d587693\" (UID: \"bdcfaec8-5faa-4311-8218-274d7d587693\") " Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.277474 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-utilities" (OuterVolumeSpecName: "utilities") pod "bdcfaec8-5faa-4311-8218-274d7d587693" (UID: "bdcfaec8-5faa-4311-8218-274d7d587693"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.281589 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdcfaec8-5faa-4311-8218-274d7d587693-kube-api-access-fpx25" (OuterVolumeSpecName: "kube-api-access-fpx25") pod "bdcfaec8-5faa-4311-8218-274d7d587693" (UID: "bdcfaec8-5faa-4311-8218-274d7d587693"). InnerVolumeSpecName "kube-api-access-fpx25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.294284 4992 generic.go:334] "Generic (PLEG): container finished" podID="bdcfaec8-5faa-4311-8218-274d7d587693" containerID="fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369" exitCode=0 Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.294332 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79cnm" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.294383 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79cnm" event={"ID":"bdcfaec8-5faa-4311-8218-274d7d587693","Type":"ContainerDied","Data":"fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369"} Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.294465 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79cnm" event={"ID":"bdcfaec8-5faa-4311-8218-274d7d587693","Type":"ContainerDied","Data":"b366681c6c13ee18c9920d18ffde090fd68d04dfef71f540cd46c8b6bc265599"} Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.294486 4992 scope.go:117] "RemoveContainer" containerID="fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.296010 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jntmw" event={"ID":"980f4141-932f-43e9-9519-d1371656816e","Type":"ContainerStarted","Data":"b1c4504e6c2fdf44e87f8695419e0fb335adf8be075555f2a7519210cdeea8e7"} Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.301841 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdcfaec8-5faa-4311-8218-274d7d587693" (UID: 
"bdcfaec8-5faa-4311-8218-274d7d587693"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.315867 4992 scope.go:117] "RemoveContainer" containerID="8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.316142 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jntmw" podStartSLOduration=1.7993442910000002 podStartE2EDuration="6.316122472s" podCreationTimestamp="2026-01-31 09:40:42 +0000 UTC" firstStartedPulling="2026-01-31 09:40:43.440686574 +0000 UTC m=+939.412078561" lastFinishedPulling="2026-01-31 09:40:47.957464755 +0000 UTC m=+943.928856742" observedRunningTime="2026-01-31 09:40:48.313910558 +0000 UTC m=+944.285302545" watchObservedRunningTime="2026-01-31 09:40:48.316122472 +0000 UTC m=+944.287514459" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.330354 4992 scope.go:117] "RemoveContainer" containerID="16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.342377 4992 scope.go:117] "RemoveContainer" containerID="fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369" Jan 31 09:40:48 crc kubenswrapper[4992]: E0131 09:40:48.342935 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369\": container with ID starting with fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369 not found: ID does not exist" containerID="fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.342963 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369"} err="failed to get container status \"fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369\": rpc error: code = NotFound desc = could not find container \"fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369\": container with ID starting with fc6fa6ee9c4624131eeffde0caca49d8c7c70b85d264dda55b291c8c51c5e369 not found: ID does not exist" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.342985 4992 scope.go:117] "RemoveContainer" containerID="8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737" Jan 31 09:40:48 crc kubenswrapper[4992]: E0131 09:40:48.343636 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737\": container with ID starting with 8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737 not found: ID does not exist" containerID="8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.343660 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737"} err="failed to get container status \"8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737\": rpc error: code = NotFound desc = could not find container \"8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737\": container with ID starting with 8753fa560564bfd1040977e230bbe2518a47ce41d74b99b3cb19a0419fd04737 not found: ID does not exist" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.343676 4992 scope.go:117] "RemoveContainer" containerID="16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0" Jan 31 09:40:48 crc kubenswrapper[4992]: E0131 09:40:48.343934 4992 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0\": container with ID starting with 16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0 not found: ID does not exist" containerID="16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.343959 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0"} err="failed to get container status \"16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0\": rpc error: code = NotFound desc = could not find container \"16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0\": container with ID starting with 16a98e9d5f4400c7455abc1c76073e9a9b39d7b0e7be325a3bd8ecc0d2f4c9f0 not found: ID does not exist" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.382314 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.382377 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdcfaec8-5faa-4311-8218-274d7d587693-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.382399 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpx25\" (UniqueName: \"kubernetes.io/projected/bdcfaec8-5faa-4311-8218-274d7d587693-kube-api-access-fpx25\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.625999 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79cnm"] Jan 31 09:40:48 crc kubenswrapper[4992]: I0131 09:40:48.630147 4992 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79cnm"] Jan 31 09:40:49 crc kubenswrapper[4992]: I0131 09:40:49.190678 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" path="/var/lib/kubelet/pods/bdcfaec8-5faa-4311-8218-274d7d587693/volumes" Jan 31 09:40:49 crc kubenswrapper[4992]: I0131 09:40:49.472767 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncgdf"] Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.311736 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ncgdf" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="registry-server" containerID="cri-o://b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6" gracePeriod=2 Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.710352 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.811508 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-catalog-content\") pod \"78244768-db4f-451c-8e41-54332e43bc39\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.811570 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-utilities\") pod \"78244768-db4f-451c-8e41-54332e43bc39\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.811615 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89sc4\" (UniqueName: \"kubernetes.io/projected/78244768-db4f-451c-8e41-54332e43bc39-kube-api-access-89sc4\") pod \"78244768-db4f-451c-8e41-54332e43bc39\" (UID: \"78244768-db4f-451c-8e41-54332e43bc39\") " Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.813042 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-utilities" (OuterVolumeSpecName: "utilities") pod "78244768-db4f-451c-8e41-54332e43bc39" (UID: "78244768-db4f-451c-8e41-54332e43bc39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.818548 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78244768-db4f-451c-8e41-54332e43bc39-kube-api-access-89sc4" (OuterVolumeSpecName: "kube-api-access-89sc4") pod "78244768-db4f-451c-8e41-54332e43bc39" (UID: "78244768-db4f-451c-8e41-54332e43bc39"). InnerVolumeSpecName "kube-api-access-89sc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.879408 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78244768-db4f-451c-8e41-54332e43bc39" (UID: "78244768-db4f-451c-8e41-54332e43bc39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.913470 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.913518 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78244768-db4f-451c-8e41-54332e43bc39-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:50 crc kubenswrapper[4992]: I0131 09:40:50.913531 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89sc4\" (UniqueName: \"kubernetes.io/projected/78244768-db4f-451c-8e41-54332e43bc39-kube-api-access-89sc4\") on node \"crc\" DevicePath \"\"" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.322261 4992 generic.go:334] "Generic (PLEG): container finished" podID="78244768-db4f-451c-8e41-54332e43bc39" containerID="b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6" exitCode=0 Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.322329 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncgdf" event={"ID":"78244768-db4f-451c-8e41-54332e43bc39","Type":"ContainerDied","Data":"b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6"} Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.322384 4992 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-ncgdf" event={"ID":"78244768-db4f-451c-8e41-54332e43bc39","Type":"ContainerDied","Data":"bc375985076219752f3724fa3621b080088abc47f87f9e3d49a38d15cc8b8631"} Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.322467 4992 scope.go:117] "RemoveContainer" containerID="b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.323361 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncgdf" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.342481 4992 scope.go:117] "RemoveContainer" containerID="59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.351643 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncgdf"] Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.358713 4992 scope.go:117] "RemoveContainer" containerID="ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.359135 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ncgdf"] Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.392492 4992 scope.go:117] "RemoveContainer" containerID="b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6" Jan 31 09:40:51 crc kubenswrapper[4992]: E0131 09:40:51.392950 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6\": container with ID starting with b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6 not found: ID does not exist" containerID="b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 
09:40:51.392988 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6"} err="failed to get container status \"b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6\": rpc error: code = NotFound desc = could not find container \"b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6\": container with ID starting with b8c82a024e2b9015eda05ec6eb058ba7fb602ced2b1d5559fc8560d2fe9453b6 not found: ID does not exist" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.393009 4992 scope.go:117] "RemoveContainer" containerID="59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d" Jan 31 09:40:51 crc kubenswrapper[4992]: E0131 09:40:51.393495 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d\": container with ID starting with 59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d not found: ID does not exist" containerID="59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.393521 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d"} err="failed to get container status \"59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d\": rpc error: code = NotFound desc = could not find container \"59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d\": container with ID starting with 59df4885d5889ba8dcc5e5d8c4161d2666ffb5fe5479a119ebfc87fa4f99cc1d not found: ID does not exist" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.393540 4992 scope.go:117] "RemoveContainer" containerID="ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd" Jan 31 09:40:51 crc 
kubenswrapper[4992]: E0131 09:40:51.393804 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd\": container with ID starting with ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd not found: ID does not exist" containerID="ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd" Jan 31 09:40:51 crc kubenswrapper[4992]: I0131 09:40:51.393830 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd"} err="failed to get container status \"ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd\": rpc error: code = NotFound desc = could not find container \"ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd\": container with ID starting with ffec450b70f6ed1ed772f19e5e75c1475868ec884376132541b75af63aacd9bd not found: ID does not exist" Jan 31 09:40:52 crc kubenswrapper[4992]: I0131 09:40:52.996065 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:52 crc kubenswrapper[4992]: I0131 09:40:52.996131 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:53 crc kubenswrapper[4992]: I0131 09:40:53.021039 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:53 crc kubenswrapper[4992]: I0131 09:40:53.201731 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78244768-db4f-451c-8e41-54332e43bc39" path="/var/lib/kubelet/pods/78244768-db4f-451c-8e41-54332e43bc39/volumes" Jan 31 09:40:53 crc kubenswrapper[4992]: I0131 09:40:53.368129 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-index-jntmw" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.708963 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq"] Jan 31 09:40:55 crc kubenswrapper[4992]: E0131 09:40:55.709475 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="extract-content" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709492 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="extract-content" Jan 31 09:40:55 crc kubenswrapper[4992]: E0131 09:40:55.709515 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="extract-content" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709523 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="extract-content" Jan 31 09:40:55 crc kubenswrapper[4992]: E0131 09:40:55.709539 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="extract-utilities" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709548 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="extract-utilities" Jan 31 09:40:55 crc kubenswrapper[4992]: E0131 09:40:55.709559 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="registry-server" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709568 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="registry-server" Jan 31 09:40:55 crc kubenswrapper[4992]: E0131 09:40:55.709585 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="extract-utilities" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709594 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="extract-utilities" Jan 31 09:40:55 crc kubenswrapper[4992]: E0131 09:40:55.709604 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="registry-server" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709613 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="registry-server" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709739 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="78244768-db4f-451c-8e41-54332e43bc39" containerName="registry-server" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.709756 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdcfaec8-5faa-4311-8218-274d7d587693" containerName="registry-server" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.710790 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.712994 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zq65f" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.715344 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq"] Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.778092 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-bundle\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.778187 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-util\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.778290 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpcsp\" (UniqueName: \"kubernetes.io/projected/876252a0-4bc3-4deb-808b-16af91439ae7-kube-api-access-qpcsp\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 
09:40:55.879974 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-bundle\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.880034 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-util\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.880102 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpcsp\" (UniqueName: \"kubernetes.io/projected/876252a0-4bc3-4deb-808b-16af91439ae7-kube-api-access-qpcsp\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.880616 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-bundle\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.880718 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-util\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:55 crc kubenswrapper[4992]: I0131 09:40:55.900477 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpcsp\" (UniqueName: \"kubernetes.io/projected/876252a0-4bc3-4deb-808b-16af91439ae7-kube-api-access-qpcsp\") pod \"74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:56 crc kubenswrapper[4992]: I0131 09:40:56.059901 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:40:56 crc kubenswrapper[4992]: I0131 09:40:56.266918 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq"] Jan 31 09:40:56 crc kubenswrapper[4992]: I0131 09:40:56.362292 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" event={"ID":"876252a0-4bc3-4deb-808b-16af91439ae7","Type":"ContainerStarted","Data":"804fabe3b30aa9e6226d629fb9532f322e7870b4679db9e1a49de261bcd66189"} Jan 31 09:40:57 crc kubenswrapper[4992]: I0131 09:40:57.369151 4992 generic.go:334] "Generic (PLEG): container finished" podID="876252a0-4bc3-4deb-808b-16af91439ae7" containerID="d12f8489999e0bbfc3ae20c0aaf6dfcefb5db1e668f16a680b60dd91d4004cf8" exitCode=0 Jan 31 09:40:57 crc kubenswrapper[4992]: I0131 09:40:57.369200 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" event={"ID":"876252a0-4bc3-4deb-808b-16af91439ae7","Type":"ContainerDied","Data":"d12f8489999e0bbfc3ae20c0aaf6dfcefb5db1e668f16a680b60dd91d4004cf8"} Jan 31 09:40:58 crc kubenswrapper[4992]: I0131 09:40:58.376064 4992 generic.go:334] "Generic (PLEG): container finished" podID="876252a0-4bc3-4deb-808b-16af91439ae7" containerID="c36fd34f096c6102cc919460aadb350746c7460fc30e0ec1e0c3b7d7d4bbada8" exitCode=0 Jan 31 09:40:58 crc kubenswrapper[4992]: I0131 09:40:58.376154 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" event={"ID":"876252a0-4bc3-4deb-808b-16af91439ae7","Type":"ContainerDied","Data":"c36fd34f096c6102cc919460aadb350746c7460fc30e0ec1e0c3b7d7d4bbada8"} Jan 31 09:40:59 crc kubenswrapper[4992]: I0131 09:40:59.383608 4992 generic.go:334] "Generic (PLEG): container finished" podID="876252a0-4bc3-4deb-808b-16af91439ae7" containerID="4dad06de292537282edca8ee442a244ead9134cf35a9e660e5022d7e3aeee695" exitCode=0 Jan 31 09:40:59 crc kubenswrapper[4992]: I0131 09:40:59.383654 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" event={"ID":"876252a0-4bc3-4deb-808b-16af91439ae7","Type":"ContainerDied","Data":"4dad06de292537282edca8ee442a244ead9134cf35a9e660e5022d7e3aeee695"} Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.614103 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.743968 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-bundle\") pod \"876252a0-4bc3-4deb-808b-16af91439ae7\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.744055 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-util\") pod \"876252a0-4bc3-4deb-808b-16af91439ae7\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.744122 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpcsp\" (UniqueName: \"kubernetes.io/projected/876252a0-4bc3-4deb-808b-16af91439ae7-kube-api-access-qpcsp\") pod \"876252a0-4bc3-4deb-808b-16af91439ae7\" (UID: \"876252a0-4bc3-4deb-808b-16af91439ae7\") " Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.744984 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-bundle" (OuterVolumeSpecName: "bundle") pod "876252a0-4bc3-4deb-808b-16af91439ae7" (UID: "876252a0-4bc3-4deb-808b-16af91439ae7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.749604 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876252a0-4bc3-4deb-808b-16af91439ae7-kube-api-access-qpcsp" (OuterVolumeSpecName: "kube-api-access-qpcsp") pod "876252a0-4bc3-4deb-808b-16af91439ae7" (UID: "876252a0-4bc3-4deb-808b-16af91439ae7"). InnerVolumeSpecName "kube-api-access-qpcsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.760410 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-util" (OuterVolumeSpecName: "util") pod "876252a0-4bc3-4deb-808b-16af91439ae7" (UID: "876252a0-4bc3-4deb-808b-16af91439ae7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.845407 4992 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.845468 4992 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/876252a0-4bc3-4deb-808b-16af91439ae7-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:41:00 crc kubenswrapper[4992]: I0131 09:41:00.845483 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpcsp\" (UniqueName: \"kubernetes.io/projected/876252a0-4bc3-4deb-808b-16af91439ae7-kube-api-access-qpcsp\") on node \"crc\" DevicePath \"\"" Jan 31 09:41:01 crc kubenswrapper[4992]: I0131 09:41:01.396809 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" event={"ID":"876252a0-4bc3-4deb-808b-16af91439ae7","Type":"ContainerDied","Data":"804fabe3b30aa9e6226d629fb9532f322e7870b4679db9e1a49de261bcd66189"} Jan 31 09:41:01 crc kubenswrapper[4992]: I0131 09:41:01.397146 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="804fabe3b30aa9e6226d629fb9532f322e7870b4679db9e1a49de261bcd66189" Jan 31 09:41:01 crc kubenswrapper[4992]: I0131 09:41:01.396994 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.486362 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-76c496b575-d66w5"] Jan 31 09:41:04 crc kubenswrapper[4992]: E0131 09:41:04.486951 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="pull" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.486969 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="pull" Jan 31 09:41:04 crc kubenswrapper[4992]: E0131 09:41:04.486996 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="extract" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.487004 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="extract" Jan 31 09:41:04 crc kubenswrapper[4992]: E0131 09:41:04.487017 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="util" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.487027 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="util" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.487155 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="876252a0-4bc3-4deb-808b-16af91439ae7" containerName="extract" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.487693 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.490311 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-n6p8x" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.511366 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-76c496b575-d66w5"] Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.593234 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blpl6\" (UniqueName: \"kubernetes.io/projected/8391a985-63ab-4b6a-89e9-9268c8e70e81-kube-api-access-blpl6\") pod \"openstack-operator-controller-init-76c496b575-d66w5\" (UID: \"8391a985-63ab-4b6a-89e9-9268c8e70e81\") " pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.694086 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blpl6\" (UniqueName: \"kubernetes.io/projected/8391a985-63ab-4b6a-89e9-9268c8e70e81-kube-api-access-blpl6\") pod \"openstack-operator-controller-init-76c496b575-d66w5\" (UID: \"8391a985-63ab-4b6a-89e9-9268c8e70e81\") " pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.712579 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blpl6\" (UniqueName: \"kubernetes.io/projected/8391a985-63ab-4b6a-89e9-9268c8e70e81-kube-api-access-blpl6\") pod \"openstack-operator-controller-init-76c496b575-d66w5\" (UID: \"8391a985-63ab-4b6a-89e9-9268c8e70e81\") " pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:04 crc kubenswrapper[4992]: I0131 09:41:04.820029 4992 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:05 crc kubenswrapper[4992]: I0131 09:41:05.248367 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-76c496b575-d66w5"] Jan 31 09:41:05 crc kubenswrapper[4992]: I0131 09:41:05.421665 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" event={"ID":"8391a985-63ab-4b6a-89e9-9268c8e70e81","Type":"ContainerStarted","Data":"3ddce8e4cf5bddf88c2e042d042fa8b614d991af2a99364ed1bc1ea9eb65b0bb"} Jan 31 09:41:10 crc kubenswrapper[4992]: I0131 09:41:10.457303 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" event={"ID":"8391a985-63ab-4b6a-89e9-9268c8e70e81","Type":"ContainerStarted","Data":"e428ea5c9b3c7a783faa4703f82323559a2c675e48ab859627c3888462830e5b"} Jan 31 09:41:10 crc kubenswrapper[4992]: I0131 09:41:10.458074 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:10 crc kubenswrapper[4992]: I0131 09:41:10.492251 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" podStartSLOduration=2.390045174 podStartE2EDuration="6.492215111s" podCreationTimestamp="2026-01-31 09:41:04 +0000 UTC" firstStartedPulling="2026-01-31 09:41:05.269132095 +0000 UTC m=+961.240524082" lastFinishedPulling="2026-01-31 09:41:09.371302032 +0000 UTC m=+965.342694019" observedRunningTime="2026-01-31 09:41:10.486100624 +0000 UTC m=+966.457492631" watchObservedRunningTime="2026-01-31 09:41:10.492215111 +0000 UTC m=+966.463607098" Jan 31 09:41:14 crc kubenswrapper[4992]: I0131 09:41:14.823489 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-76c496b575-d66w5" Jan 31 09:41:15 crc kubenswrapper[4992]: I0131 09:41:15.434528 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:41:15 crc kubenswrapper[4992]: I0131 09:41:15.434629 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.818375 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.819968 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.823887 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-znlmx" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.831241 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.842203 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.843282 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.846275 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-k6jzv" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.851051 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.851948 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.855547 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v4mqp" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.867347 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.886639 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.899533 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.906447 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h6g4m" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.924127 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shjr\" (UniqueName: \"kubernetes.io/projected/93e6253c-64e6-458c-a400-c9587c015da2-kube-api-access-8shjr\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-xvpg7\" (UID: \"93e6253c-64e6-458c-a400-c9587c015da2\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.929564 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.952541 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.953686 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.958542 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-dvb4x" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.969531 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq"] Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.970490 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.978616 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qgfzx" Jan 31 09:41:33 crc kubenswrapper[4992]: I0131 09:41:33.988313 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.017119 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027045 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh57g\" (UniqueName: \"kubernetes.io/projected/0237347b-28b1-42cb-b0c3-3d0cdb660846-kube-api-access-vh57g\") pod \"designate-operator-controller-manager-6d9697b7f4-p5bv7\" (UID: \"0237347b-28b1-42cb-b0c3-3d0cdb660846\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027112 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrwh\" (UniqueName: \"kubernetes.io/projected/3a5af5db-f87a-4df8-bb90-81b56a48ec32-kube-api-access-kmrwh\") pod \"cinder-operator-controller-manager-8d874c8fc-fjhnl\" (UID: \"3a5af5db-f87a-4df8-bb90-81b56a48ec32\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027160 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shjr\" (UniqueName: \"kubernetes.io/projected/93e6253c-64e6-458c-a400-c9587c015da2-kube-api-access-8shjr\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-xvpg7\" 
(UID: \"93e6253c-64e6-458c-a400-c9587c015da2\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027182 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5k2s\" (UniqueName: \"kubernetes.io/projected/429d0324-062a-4afe-88e7-ff0fc725b5fc-kube-api-access-c5k2s\") pod \"heat-operator-controller-manager-69d6db494d-mgfln\" (UID: \"429d0324-062a-4afe-88e7-ff0fc725b5fc\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027203 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2r6\" (UniqueName: \"kubernetes.io/projected/c696cc9f-2c26-4f46-8a14-fc28c65d1855-kube-api-access-pn2r6\") pod \"horizon-operator-controller-manager-5fb775575f-jtscq\" (UID: \"c696cc9f-2c26-4f46-8a14-fc28c65d1855\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027225 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl2mp\" (UniqueName: \"kubernetes.io/projected/f03e2ba4-3389-499d-866f-8349a5a41b9e-kube-api-access-hl2mp\") pod \"glance-operator-controller-manager-8886f4c47-9qxn8\" (UID: \"f03e2ba4-3389-499d-866f-8349a5a41b9e\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.027647 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.038578 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-98f7s"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 
09:41:34.039336 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.041384 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.041860 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.042063 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.042093 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kjhl4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.043523 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kcbrh" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.054485 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.055234 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shjr\" (UniqueName: \"kubernetes.io/projected/93e6253c-64e6-458c-a400-c9587c015da2-kube-api-access-8shjr\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-xvpg7\" (UID: \"93e6253c-64e6-458c-a400-c9587c015da2\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.055457 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.059714 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.060700 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.066902 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-vbpfh" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.067090 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hrrrk" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.070551 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-98f7s"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.075166 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.082606 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.085291 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.106442 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.107449 4992 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.107845 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.112275 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vg628" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.115392 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.116329 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.124316 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-ltpw8" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128147 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128189 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppsld\" (UniqueName: \"kubernetes.io/projected/56f8199c-cece-4c37-a1ab-87f1eae9bd83-kube-api-access-ppsld\") pod \"neutron-operator-controller-manager-585dbc889-pqvpw\" (UID: \"56f8199c-cece-4c37-a1ab-87f1eae9bd83\") " 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128214 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh57g\" (UniqueName: \"kubernetes.io/projected/0237347b-28b1-42cb-b0c3-3d0cdb660846-kube-api-access-vh57g\") pod \"designate-operator-controller-manager-6d9697b7f4-p5bv7\" (UID: \"0237347b-28b1-42cb-b0c3-3d0cdb660846\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128247 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmj9p\" (UniqueName: \"kubernetes.io/projected/d8bbd2ef-463b-430b-8332-a6f48ea54e75-kube-api-access-tmj9p\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128273 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrwh\" (UniqueName: \"kubernetes.io/projected/3a5af5db-f87a-4df8-bb90-81b56a48ec32-kube-api-access-kmrwh\") pod \"cinder-operator-controller-manager-8d874c8fc-fjhnl\" (UID: \"3a5af5db-f87a-4df8-bb90-81b56a48ec32\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128300 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cn4z\" (UniqueName: \"kubernetes.io/projected/8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58-kube-api-access-4cn4z\") pod \"ironic-operator-controller-manager-5f4b8bd54d-fgt9z\" (UID: \"8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:34 crc kubenswrapper[4992]: 
I0131 09:41:34.128322 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nphs\" (UniqueName: \"kubernetes.io/projected/99af6e66-0ebf-4bb3-a943-d106e624df65-kube-api-access-6nphs\") pod \"keystone-operator-controller-manager-84f48565d4-w2697\" (UID: \"99af6e66-0ebf-4bb3-a943-d106e624df65\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128344 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd98b\" (UniqueName: \"kubernetes.io/projected/1052ce64-cfa7-4fd8-be68-849dc5cfc74f-kube-api-access-fd98b\") pod \"mariadb-operator-controller-manager-67bf948998-js4c9\" (UID: \"1052ce64-cfa7-4fd8-be68-849dc5cfc74f\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128370 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqhx9\" (UniqueName: \"kubernetes.io/projected/f74c9115-a924-4896-a04d-4f523eadafa7-kube-api-access-lqhx9\") pod \"manila-operator-controller-manager-7dd968899f-kqs5z\" (UID: \"f74c9115-a924-4896-a04d-4f523eadafa7\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128394 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5k2s\" (UniqueName: \"kubernetes.io/projected/429d0324-062a-4afe-88e7-ff0fc725b5fc-kube-api-access-c5k2s\") pod \"heat-operator-controller-manager-69d6db494d-mgfln\" (UID: \"429d0324-062a-4afe-88e7-ff0fc725b5fc\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128430 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pn2r6\" (UniqueName: \"kubernetes.io/projected/c696cc9f-2c26-4f46-8a14-fc28c65d1855-kube-api-access-pn2r6\") pod \"horizon-operator-controller-manager-5fb775575f-jtscq\" (UID: \"c696cc9f-2c26-4f46-8a14-fc28c65d1855\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.128454 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl2mp\" (UniqueName: \"kubernetes.io/projected/f03e2ba4-3389-499d-866f-8349a5a41b9e-kube-api-access-hl2mp\") pod \"glance-operator-controller-manager-8886f4c47-9qxn8\" (UID: \"f03e2ba4-3389-499d-866f-8349a5a41b9e\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.137040 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.148317 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.149189 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.149581 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.150595 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2r6\" (UniqueName: \"kubernetes.io/projected/c696cc9f-2c26-4f46-8a14-fc28c65d1855-kube-api-access-pn2r6\") pod \"horizon-operator-controller-manager-5fb775575f-jtscq\" (UID: \"c696cc9f-2c26-4f46-8a14-fc28c65d1855\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.157203 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.158165 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.162019 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrwh\" (UniqueName: \"kubernetes.io/projected/3a5af5db-f87a-4df8-bb90-81b56a48ec32-kube-api-access-kmrwh\") pod \"cinder-operator-controller-manager-8d874c8fc-fjhnl\" (UID: \"3a5af5db-f87a-4df8-bb90-81b56a48ec32\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.162203 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5k2s\" (UniqueName: \"kubernetes.io/projected/429d0324-062a-4afe-88e7-ff0fc725b5fc-kube-api-access-c5k2s\") pod \"heat-operator-controller-manager-69d6db494d-mgfln\" (UID: \"429d0324-062a-4afe-88e7-ff0fc725b5fc\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.163216 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hl2mp\" (UniqueName: \"kubernetes.io/projected/f03e2ba4-3389-499d-866f-8349a5a41b9e-kube-api-access-hl2mp\") pod \"glance-operator-controller-manager-8886f4c47-9qxn8\" (UID: \"f03e2ba4-3389-499d-866f-8349a5a41b9e\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.166552 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.167289 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.167802 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-czzjz" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.167966 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-h26gc" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.176873 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.189589 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.190613 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.195186 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh57g\" (UniqueName: \"kubernetes.io/projected/0237347b-28b1-42cb-b0c3-3d0cdb660846-kube-api-access-vh57g\") pod \"designate-operator-controller-manager-6d9697b7f4-p5bv7\" (UID: \"0237347b-28b1-42cb-b0c3-3d0cdb660846\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.197036 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vlsp8" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.200061 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.201494 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.203491 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.203779 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zn6k9" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.203986 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.208620 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.209548 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.212130 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.217394 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.218327 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.226117 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-54pwc" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.226216 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-6hhzn" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.228992 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.229162 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppsld\" (UniqueName: \"kubernetes.io/projected/56f8199c-cece-4c37-a1ab-87f1eae9bd83-kube-api-access-ppsld\") pod \"neutron-operator-controller-manager-585dbc889-pqvpw\" (UID: \"56f8199c-cece-4c37-a1ab-87f1eae9bd83\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.229288 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kbcm\" (UniqueName: \"kubernetes.io/projected/aefa948a-54b2-4cff-8934-2af2310b73da-kube-api-access-6kbcm\") pod \"swift-operator-controller-manager-68fc8c869-qg4c6\" (UID: \"aefa948a-54b2-4cff-8934-2af2310b73da\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.230071 4992 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.230128 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert podName:d8bbd2ef-463b-430b-8332-a6f48ea54e75 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:34.730110078 +0000 UTC m=+990.701502065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert") pod "infra-operator-controller-manager-79955696d6-98f7s" (UID: "d8bbd2ef-463b-430b-8332-a6f48ea54e75") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.230702 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmj9p\" (UniqueName: \"kubernetes.io/projected/d8bbd2ef-463b-430b-8332-a6f48ea54e75-kube-api-access-tmj9p\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.230762 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cn4z\" (UniqueName: \"kubernetes.io/projected/8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58-kube-api-access-4cn4z\") pod \"ironic-operator-controller-manager-5f4b8bd54d-fgt9z\" (UID: \"8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.230791 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nphs\" (UniqueName: \"kubernetes.io/projected/99af6e66-0ebf-4bb3-a943-d106e624df65-kube-api-access-6nphs\") pod 
\"keystone-operator-controller-manager-84f48565d4-w2697\" (UID: \"99af6e66-0ebf-4bb3-a943-d106e624df65\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.230818 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd98b\" (UniqueName: \"kubernetes.io/projected/1052ce64-cfa7-4fd8-be68-849dc5cfc74f-kube-api-access-fd98b\") pod \"mariadb-operator-controller-manager-67bf948998-js4c9\" (UID: \"1052ce64-cfa7-4fd8-be68-849dc5cfc74f\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.230842 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqhx9\" (UniqueName: \"kubernetes.io/projected/f74c9115-a924-4896-a04d-4f523eadafa7-kube-api-access-lqhx9\") pod \"manila-operator-controller-manager-7dd968899f-kqs5z\" (UID: \"f74c9115-a924-4896-a04d-4f523eadafa7\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.232697 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.258561 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.263287 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.268965 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nphs\" (UniqueName: \"kubernetes.io/projected/99af6e66-0ebf-4bb3-a943-d106e624df65-kube-api-access-6nphs\") pod \"keystone-operator-controller-manager-84f48565d4-w2697\" (UID: \"99af6e66-0ebf-4bb3-a943-d106e624df65\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.275151 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.285832 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqhx9\" (UniqueName: \"kubernetes.io/projected/f74c9115-a924-4896-a04d-4f523eadafa7-kube-api-access-lqhx9\") pod \"manila-operator-controller-manager-7dd968899f-kqs5z\" (UID: \"f74c9115-a924-4896-a04d-4f523eadafa7\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.286996 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd98b\" (UniqueName: \"kubernetes.io/projected/1052ce64-cfa7-4fd8-be68-849dc5cfc74f-kube-api-access-fd98b\") pod \"mariadb-operator-controller-manager-67bf948998-js4c9\" (UID: \"1052ce64-cfa7-4fd8-be68-849dc5cfc74f\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.296052 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.296105 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cn4z\" (UniqueName: \"kubernetes.io/projected/8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58-kube-api-access-4cn4z\") pod \"ironic-operator-controller-manager-5f4b8bd54d-fgt9z\" (UID: \"8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.298381 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppsld\" (UniqueName: \"kubernetes.io/projected/56f8199c-cece-4c37-a1ab-87f1eae9bd83-kube-api-access-ppsld\") pod \"neutron-operator-controller-manager-585dbc889-pqvpw\" (UID: \"56f8199c-cece-4c37-a1ab-87f1eae9bd83\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.302930 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmj9p\" (UniqueName: \"kubernetes.io/projected/d8bbd2ef-463b-430b-8332-a6f48ea54e75-kube-api-access-tmj9p\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.311326 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.332599 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjdwx\" (UniqueName: \"kubernetes.io/projected/692b4d55-c298-4489-9381-73e748a0bc5d-kube-api-access-hjdwx\") pod \"placement-operator-controller-manager-5b964cf4cd-4jkf5\" (UID: \"692b4d55-c298-4489-9381-73e748a0bc5d\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.337985 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.339944 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662nd\" (UniqueName: \"kubernetes.io/projected/b14e0dd4-9405-4358-802c-631e838b0746-kube-api-access-662nd\") pod \"nova-operator-controller-manager-55bff696bd-wxxrr\" (UID: \"b14e0dd4-9405-4358-802c-631e838b0746\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.340036 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x26kn\" (UniqueName: \"kubernetes.io/projected/5db9c6d8-aa24-4750-9fa1-6b961574dddb-kube-api-access-x26kn\") pod \"ovn-operator-controller-manager-788c46999f-d56qf\" (UID: \"5db9c6d8-aa24-4750-9fa1-6b961574dddb\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" 
Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.340086 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cts86\" (UniqueName: \"kubernetes.io/projected/a24dcaba-7ba8-45f1-9d82-6d080be373c8-kube-api-access-cts86\") pod \"octavia-operator-controller-manager-6687f8d877-2j9gv\" (UID: \"a24dcaba-7ba8-45f1-9d82-6d080be373c8\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.340167 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjft\" (UniqueName: \"kubernetes.io/projected/185c6250-c2c7-4ff3-bfe6-6449f78269f2-kube-api-access-ncjft\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.340298 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kbcm\" (UniqueName: \"kubernetes.io/projected/aefa948a-54b2-4cff-8934-2af2310b73da-kube-api-access-6kbcm\") pod \"swift-operator-controller-manager-68fc8c869-qg4c6\" (UID: \"aefa948a-54b2-4cff-8934-2af2310b73da\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.355854 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.356675 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.356745 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.360681 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xh44h" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.370155 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kbcm\" (UniqueName: \"kubernetes.io/projected/aefa948a-54b2-4cff-8934-2af2310b73da-kube-api-access-6kbcm\") pod \"swift-operator-controller-manager-68fc8c869-qg4c6\" (UID: \"aefa948a-54b2-4cff-8934-2af2310b73da\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.414020 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.419490 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.420277 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.427510 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hhxfz" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.433983 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.443787 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x26kn\" (UniqueName: \"kubernetes.io/projected/5db9c6d8-aa24-4750-9fa1-6b961574dddb-kube-api-access-x26kn\") pod \"ovn-operator-controller-manager-788c46999f-d56qf\" (UID: \"5db9c6d8-aa24-4750-9fa1-6b961574dddb\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.443828 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cts86\" (UniqueName: \"kubernetes.io/projected/a24dcaba-7ba8-45f1-9d82-6d080be373c8-kube-api-access-cts86\") pod \"octavia-operator-controller-manager-6687f8d877-2j9gv\" (UID: \"a24dcaba-7ba8-45f1-9d82-6d080be373c8\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.443867 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjft\" (UniqueName: \"kubernetes.io/projected/185c6250-c2c7-4ff3-bfe6-6449f78269f2-kube-api-access-ncjft\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.443957 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjdwx\" (UniqueName: \"kubernetes.io/projected/692b4d55-c298-4489-9381-73e748a0bc5d-kube-api-access-hjdwx\") pod \"placement-operator-controller-manager-5b964cf4cd-4jkf5\" (UID: \"692b4d55-c298-4489-9381-73e748a0bc5d\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:41:34 crc 
kubenswrapper[4992]: I0131 09:41:34.443975 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.443992 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-662nd\" (UniqueName: \"kubernetes.io/projected/b14e0dd4-9405-4358-802c-631e838b0746-kube-api-access-662nd\") pod \"nova-operator-controller-manager-55bff696bd-wxxrr\" (UID: \"b14e0dd4-9405-4358-802c-631e838b0746\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.445328 4992 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.445736 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert podName:185c6250-c2c7-4ff3-bfe6-6449f78269f2 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:34.945719578 +0000 UTC m=+990.917111555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" (UID: "185c6250-c2c7-4ff3-bfe6-6449f78269f2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.445796 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.451941 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.464496 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cts86\" (UniqueName: \"kubernetes.io/projected/a24dcaba-7ba8-45f1-9d82-6d080be373c8-kube-api-access-cts86\") pod \"octavia-operator-controller-manager-6687f8d877-2j9gv\" (UID: \"a24dcaba-7ba8-45f1-9d82-6d080be373c8\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.468054 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjft\" (UniqueName: \"kubernetes.io/projected/185c6250-c2c7-4ff3-bfe6-6449f78269f2-kube-api-access-ncjft\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.471844 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.471998 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x26kn\" (UniqueName: \"kubernetes.io/projected/5db9c6d8-aa24-4750-9fa1-6b961574dddb-kube-api-access-x26kn\") pod \"ovn-operator-controller-manager-788c46999f-d56qf\" (UID: \"5db9c6d8-aa24-4750-9fa1-6b961574dddb\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.474984 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-662nd\" (UniqueName: \"kubernetes.io/projected/b14e0dd4-9405-4358-802c-631e838b0746-kube-api-access-662nd\") pod \"nova-operator-controller-manager-55bff696bd-wxxrr\" (UID: \"b14e0dd4-9405-4358-802c-631e838b0746\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.480175 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjdwx\" (UniqueName: \"kubernetes.io/projected/692b4d55-c298-4489-9381-73e748a0bc5d-kube-api-access-hjdwx\") pod \"placement-operator-controller-manager-5b964cf4cd-4jkf5\" (UID: \"692b4d55-c298-4489-9381-73e748a0bc5d\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.489592 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.522343 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-grr4s"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.532159 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-grr4s"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.532288 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.546995 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-d42xq" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.550729 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6vjt\" (UniqueName: \"kubernetes.io/projected/a0ff4139-6d2d-4a25-b0de-29a4e48e407f-kube-api-access-j6vjt\") pod \"telemetry-operator-controller-manager-64b5b76f97-khm8v\" (UID: \"a0ff4139-6d2d-4a25-b0de-29a4e48e407f\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.550808 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqnf\" (UniqueName: \"kubernetes.io/projected/7496dda7-bc5b-4f0f-a93a-176f397fbeca-kube-api-access-twqnf\") pod \"test-operator-controller-manager-6944ddd655-mmsrl\" (UID: \"7496dda7-bc5b-4f0f-a93a-176f397fbeca\") " pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.550845 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp4p4\" (UniqueName: \"kubernetes.io/projected/029e3183-fa79-4992-a074-61281511f268-kube-api-access-lp4p4\") pod \"watcher-operator-controller-manager-564965969-grr4s\" (UID: \"029e3183-fa79-4992-a074-61281511f268\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.561671 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.567862 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.578722 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.585676 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.585905 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pxx7q" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.586704 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.604824 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.659556 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqnf\" (UniqueName: 
\"kubernetes.io/projected/7496dda7-bc5b-4f0f-a93a-176f397fbeca-kube-api-access-twqnf\") pod \"test-operator-controller-manager-6944ddd655-mmsrl\" (UID: \"7496dda7-bc5b-4f0f-a93a-176f397fbeca\") " pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.659625 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp4p4\" (UniqueName: \"kubernetes.io/projected/029e3183-fa79-4992-a074-61281511f268-kube-api-access-lp4p4\") pod \"watcher-operator-controller-manager-564965969-grr4s\" (UID: \"029e3183-fa79-4992-a074-61281511f268\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.659733 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6vjt\" (UniqueName: \"kubernetes.io/projected/a0ff4139-6d2d-4a25-b0de-29a4e48e407f-kube-api-access-j6vjt\") pod \"telemetry-operator-controller-manager-64b5b76f97-khm8v\" (UID: \"a0ff4139-6d2d-4a25-b0de-29a4e48e407f\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.661333 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.676446 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.679296 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.684074 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.687891 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-pg4sg" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.693572 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp4p4\" (UniqueName: \"kubernetes.io/projected/029e3183-fa79-4992-a074-61281511f268-kube-api-access-lp4p4\") pod \"watcher-operator-controller-manager-564965969-grr4s\" (UID: \"029e3183-fa79-4992-a074-61281511f268\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.703728 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.708240 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.713133 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqnf\" (UniqueName: \"kubernetes.io/projected/7496dda7-bc5b-4f0f-a93a-176f397fbeca-kube-api-access-twqnf\") pod \"test-operator-controller-manager-6944ddd655-mmsrl\" (UID: \"7496dda7-bc5b-4f0f-a93a-176f397fbeca\") " pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.761226 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.761311 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzm8n\" (UniqueName: \"kubernetes.io/projected/cbf54a49-6a1c-49ae-80f6-5beee6be7377-kube-api-access-rzm8n\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.761339 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 
09:41:34.761466 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.761617 4992 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.761683 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert podName:d8bbd2ef-463b-430b-8332-a6f48ea54e75 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:35.761664972 +0000 UTC m=+991.733056959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert") pod "infra-operator-controller-manager-79955696d6-98f7s" (UID: "d8bbd2ef-463b-430b-8332-a6f48ea54e75") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.764584 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.769090 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6vjt\" (UniqueName: \"kubernetes.io/projected/a0ff4139-6d2d-4a25-b0de-29a4e48e407f-kube-api-access-j6vjt\") pod \"telemetry-operator-controller-manager-64b5b76f97-khm8v\" (UID: \"a0ff4139-6d2d-4a25-b0de-29a4e48e407f\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.798965 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.809516 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.862623 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.862692 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzm8n\" (UniqueName: \"kubernetes.io/projected/cbf54a49-6a1c-49ae-80f6-5beee6be7377-kube-api-access-rzm8n\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.862717 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfzch\" (UniqueName: \"kubernetes.io/projected/d8a209a5-ff60-4e77-8745-300b0c1c542a-kube-api-access-lfzch\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nszvc\" (UID: \"d8a209a5-ff60-4e77-8745-300b0c1c542a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.862741 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.862870 4992 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.862920 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:35.362905032 +0000 UTC m=+991.334297019 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.863170 4992 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.863194 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:35.36318747 +0000 UTC m=+991.334579457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "metrics-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.914524 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzm8n\" (UniqueName: \"kubernetes.io/projected/cbf54a49-6a1c-49ae-80f6-5beee6be7377-kube-api-access-rzm8n\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.950360 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl"] Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.963683 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfzch\" (UniqueName: 
\"kubernetes.io/projected/d8a209a5-ff60-4e77-8745-300b0c1c542a-kube-api-access-lfzch\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nszvc\" (UID: \"d8a209a5-ff60-4e77-8745-300b0c1c542a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.963731 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.963874 4992 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: E0131 09:41:34.963930 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert podName:185c6250-c2c7-4ff3-bfe6-6449f78269f2 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:35.963910185 +0000 UTC m=+991.935302172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" (UID: "185c6250-c2c7-4ff3-bfe6-6449f78269f2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.969696 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:34 crc kubenswrapper[4992]: I0131 09:41:34.989324 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfzch\" (UniqueName: \"kubernetes.io/projected/d8a209a5-ff60-4e77-8745-300b0c1c542a-kube-api-access-lfzch\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nszvc\" (UID: \"d8a209a5-ff60-4e77-8745-300b0c1c542a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.140460 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.369741 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.370160 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.369927 4992 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.370365 4992 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:36.370333376 +0000 UTC m=+992.341725413 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "metrics-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.370533 4992 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.370627 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:36.370607504 +0000 UTC m=+992.341999491 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "webhook-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.397829 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8"] Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.411262 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7"] Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.533829 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln"] Jan 31 09:41:35 crc kubenswrapper[4992]: W0131 09:41:35.540303 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod429d0324_062a_4afe_88e7_ff0fc725b5fc.slice/crio-934c076bc98dfe30730cf95c8db525df3f0d988169660580c0201bea48795e97 WatchSource:0}: Error finding container 934c076bc98dfe30730cf95c8db525df3f0d988169660580c0201bea48795e97: Status 404 returned error can't find the container with id 934c076bc98dfe30730cf95c8db525df3f0d988169660580c0201bea48795e97 Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.551493 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq"] Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.564382 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw"] Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.568618 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z"] Jan 31 09:41:35 crc kubenswrapper[4992]: W0131 09:41:35.571046 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc696cc9f_2c26_4f46_8a14_fc28c65d1855.slice/crio-3b4d7877ca816f7094fb828a17bd17f4e6a3212713e885a0086d1a89303f6280 WatchSource:0}: Error finding container 3b4d7877ca816f7094fb828a17bd17f4e6a3212713e885a0086d1a89303f6280: Status 404 returned error can't find the container with id 3b4d7877ca816f7094fb828a17bd17f4e6a3212713e885a0086d1a89303f6280 Jan 31 09:41:35 crc kubenswrapper[4992]: W0131 09:41:35.571581 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8530365a_5ca4_4aae_b0cd_7f2cf9e4fb58.slice/crio-ca7546cce2e66b4cf27d2ccad1725699f18bc70b1543584ecfba142b5892114b WatchSource:0}: Error finding container ca7546cce2e66b4cf27d2ccad1725699f18bc70b1543584ecfba142b5892114b: Status 404 returned error can't find the container with id ca7546cce2e66b4cf27d2ccad1725699f18bc70b1543584ecfba142b5892114b Jan 31 09:41:35 crc kubenswrapper[4992]: W0131 09:41:35.572679 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f8199c_cece_4c37_a1ab_87f1eae9bd83.slice/crio-b5326b4ac0606e00e103dcb612c6722d3b80ccbae213c2307ad030cb6ad280cb WatchSource:0}: Error finding container b5326b4ac0606e00e103dcb612c6722d3b80ccbae213c2307ad030cb6ad280cb: Status 404 returned error can't find the container with id b5326b4ac0606e00e103dcb612c6722d3b80ccbae213c2307ad030cb6ad280cb Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.643206 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" 
event={"ID":"f03e2ba4-3389-499d-866f-8349a5a41b9e","Type":"ContainerStarted","Data":"7de77aba2a501bdf5d5dd0e348807a42a59b2bb59fa9b44aab7a81da0e4073de"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.644791 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" event={"ID":"8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58","Type":"ContainerStarted","Data":"ca7546cce2e66b4cf27d2ccad1725699f18bc70b1543584ecfba142b5892114b"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.645850 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" event={"ID":"3a5af5db-f87a-4df8-bb90-81b56a48ec32","Type":"ContainerStarted","Data":"baba0fdd42e7101426cba2454a25fa1ddf821b0f5ff9515b478dd6abb8532772"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.646768 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" event={"ID":"429d0324-062a-4afe-88e7-ff0fc725b5fc","Type":"ContainerStarted","Data":"934c076bc98dfe30730cf95c8db525df3f0d988169660580c0201bea48795e97"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.647661 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" event={"ID":"c696cc9f-2c26-4f46-8a14-fc28c65d1855","Type":"ContainerStarted","Data":"3b4d7877ca816f7094fb828a17bd17f4e6a3212713e885a0086d1a89303f6280"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.648616 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" event={"ID":"93e6253c-64e6-458c-a400-c9587c015da2","Type":"ContainerStarted","Data":"5245b63b9c5bfb6e5d93f8b0fe17331ed77db645e964f247915edcc77e6c7ad8"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.649668 4992 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" event={"ID":"56f8199c-cece-4c37-a1ab-87f1eae9bd83","Type":"ContainerStarted","Data":"b5326b4ac0606e00e103dcb612c6722d3b80ccbae213c2307ad030cb6ad280cb"} Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.774812 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.775028 4992 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.775128 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert podName:d8bbd2ef-463b-430b-8332-a6f48ea54e75 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:37.775101309 +0000 UTC m=+993.746493346 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert") pod "infra-operator-controller-manager-79955696d6-98f7s" (UID: "d8bbd2ef-463b-430b-8332-a6f48ea54e75") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.966702 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl"] Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.973626 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6"] Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.980656 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.980890 4992 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: E0131 09:41:35.980943 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert podName:185c6250-c2c7-4ff3-bfe6-6449f78269f2 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:37.980925776 +0000 UTC m=+993.952317763 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" (UID: "185c6250-c2c7-4ff3-bfe6-6449f78269f2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:35 crc kubenswrapper[4992]: I0131 09:41:35.983697 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr"] Jan 31 09:41:35 crc kubenswrapper[4992]: W0131 09:41:35.991368 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1052ce64_cfa7_4fd8_be68_849dc5cfc74f.slice/crio-3e11827d3e447834213e5d3a51a05647286f62801e0886b2072dfc06bb5b0429 WatchSource:0}: Error finding container 3e11827d3e447834213e5d3a51a05647286f62801e0886b2072dfc06bb5b0429: Status 404 returned error can't find the container with id 3e11827d3e447834213e5d3a51a05647286f62801e0886b2072dfc06bb5b0429 Jan 31 09:41:35 crc kubenswrapper[4992]: W0131 09:41:35.999885 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb14e0dd4_9405_4358_802c_631e838b0746.slice/crio-503f4c8e4941f44f54f76d4648f9fc378a3183e32f80a782cda0276bdd9fb8f7 WatchSource:0}: Error finding container 503f4c8e4941f44f54f76d4648f9fc378a3183e32f80a782cda0276bdd9fb8f7: Status 404 returned error can't find the container with id 503f4c8e4941f44f54f76d4648f9fc378a3183e32f80a782cda0276bdd9fb8f7 Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.009030 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7"] Jan 31 09:41:36 crc kubenswrapper[4992]: W0131 09:41:36.042123 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod029e3183_fa79_4992_a074_61281511f268.slice/crio-c5bada37d16f8adbdacbd4767df213564b27e4158d3eb7d3049a4acee62d00b5 WatchSource:0}: Error finding container c5bada37d16f8adbdacbd4767df213564b27e4158d3eb7d3049a4acee62d00b5: Status 404 returned error can't find the container with id c5bada37d16f8adbdacbd4767df213564b27e4158d3eb7d3049a4acee62d00b5 Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.048293 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nphs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-w2697_openstack-operators(99af6e66-0ebf-4bb3-a943-d106e624df65): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:41:36 crc kubenswrapper[4992]: W0131 09:41:36.048877 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8a209a5_ff60_4e77_8745_300b0c1c542a.slice/crio-f046da03c22b57afc9c004fdcc0f2bf5d12b456d4052c816f56abc734c5c7a06 WatchSource:0}: Error finding container f046da03c22b57afc9c004fdcc0f2bf5d12b456d4052c816f56abc734c5c7a06: Status 404 returned error can't find the container with id f046da03c22b57afc9c004fdcc0f2bf5d12b456d4052c816f56abc734c5c7a06 Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.049387 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" podUID="99af6e66-0ebf-4bb3-a943-d106e624df65" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.050103 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9"] Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.050451 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfzch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-nszvc_openstack-operators(d8a209a5-ff60-4e77-8745-300b0c1c542a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.051831 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" podUID="d8a209a5-ff60-4e77-8745-300b0c1c542a" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.079925 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z"] Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.100565 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hjdwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-4jkf5_openstack-operators(692b4d55-c298-4489-9381-73e748a0bc5d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.100969 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j6vjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-khm8v_openstack-operators(a0ff4139-6d2d-4a25-b0de-29a4e48e407f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.101094 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x26kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-788c46999f-d56qf_openstack-operators(5db9c6d8-aa24-4750-9fa1-6b961574dddb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.101676 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cts86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-2j9gv_openstack-operators(a24dcaba-7ba8-45f1-9d82-6d080be373c8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.101769 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" podUID="692b4d55-c298-4489-9381-73e748a0bc5d" Jan 31 09:41:36 crc 
kubenswrapper[4992]: I0131 09:41:36.102231 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697"] Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.102560 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" podUID="a0ff4139-6d2d-4a25-b0de-29a4e48e407f" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.103024 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" podUID="a24dcaba-7ba8-45f1-9d82-6d080be373c8" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.103503 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" podUID="5db9c6d8-aa24-4750-9fa1-6b961574dddb" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.123217 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5"] Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.129290 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-grr4s"] Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.139044 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v"] Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.147070 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc"] Jan 31 09:41:36 crc 
kubenswrapper[4992]: I0131 09:41:36.153025 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv"] Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.158752 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf"] Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.385745 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.385824 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.386031 4992 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.386126 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:38.38607304 +0000 UTC m=+994.357465027 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "webhook-server-cert" not found Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.387163 4992 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.387210 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:38.387197962 +0000 UTC m=+994.358589949 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "metrics-server-cert" not found Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.658813 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" event={"ID":"5db9c6d8-aa24-4750-9fa1-6b961574dddb","Type":"ContainerStarted","Data":"469642bfb4470b1fd7bd40f1eac501134ad0d3d35be4b600b5b96f0d32a401c6"} Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.661185 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" podUID="5db9c6d8-aa24-4750-9fa1-6b961574dddb" Jan 31 09:41:36 crc kubenswrapper[4992]: 
I0131 09:41:36.662278 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" event={"ID":"f74c9115-a924-4896-a04d-4f523eadafa7","Type":"ContainerStarted","Data":"0ddd8f3dcffca440442ccd651ad22136b639fd15464db8c2ea05e05e561d1336"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.663863 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" event={"ID":"b14e0dd4-9405-4358-802c-631e838b0746","Type":"ContainerStarted","Data":"503f4c8e4941f44f54f76d4648f9fc378a3183e32f80a782cda0276bdd9fb8f7"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.666200 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" event={"ID":"1052ce64-cfa7-4fd8-be68-849dc5cfc74f","Type":"ContainerStarted","Data":"3e11827d3e447834213e5d3a51a05647286f62801e0886b2072dfc06bb5b0429"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.667854 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" event={"ID":"029e3183-fa79-4992-a074-61281511f268","Type":"ContainerStarted","Data":"c5bada37d16f8adbdacbd4767df213564b27e4158d3eb7d3049a4acee62d00b5"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.669922 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" event={"ID":"d8a209a5-ff60-4e77-8745-300b0c1c542a","Type":"ContainerStarted","Data":"f046da03c22b57afc9c004fdcc0f2bf5d12b456d4052c816f56abc734c5c7a06"} Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.671799 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" podUID="d8a209a5-ff60-4e77-8745-300b0c1c542a" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.681015 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" event={"ID":"a0ff4139-6d2d-4a25-b0de-29a4e48e407f","Type":"ContainerStarted","Data":"515afcd452c57b5fb5168998f954df1776e3925acf1f54c4ccb5fadc35733a53"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.682491 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" event={"ID":"99af6e66-0ebf-4bb3-a943-d106e624df65","Type":"ContainerStarted","Data":"22fe21d70b8a439dd58dd90db4282a519e7ea98a6068404854f7c5339ac39b1b"} Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.682599 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" podUID="a0ff4139-6d2d-4a25-b0de-29a4e48e407f" Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.683640 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" podUID="99af6e66-0ebf-4bb3-a943-d106e624df65" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.683768 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" event={"ID":"aefa948a-54b2-4cff-8934-2af2310b73da","Type":"ContainerStarted","Data":"27f773ea3ca43aacc8536d7532a88f61a265d95a45a432cd26b5e9b57e5cad13"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.685769 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" event={"ID":"692b4d55-c298-4489-9381-73e748a0bc5d","Type":"ContainerStarted","Data":"17aacc2e2f7ceccc091bf450c6494c0650bfe83ccdc5879639643e4e37b30f67"} Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.687328 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" podUID="692b4d55-c298-4489-9381-73e748a0bc5d" Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.689092 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" event={"ID":"0237347b-28b1-42cb-b0c3-3d0cdb660846","Type":"ContainerStarted","Data":"85debb228ef5ef5e462d382db69ba324ed3a17d4c2ae089641e6ad49190b6d28"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.693775 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" event={"ID":"7496dda7-bc5b-4f0f-a93a-176f397fbeca","Type":"ContainerStarted","Data":"818cd7033ca60cb99de4b7fb8370807314457be88dac183bd8d196b9e7954adf"} Jan 31 09:41:36 crc kubenswrapper[4992]: I0131 09:41:36.706078 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" event={"ID":"a24dcaba-7ba8-45f1-9d82-6d080be373c8","Type":"ContainerStarted","Data":"78d86c908732e34139530d3febaa409038a0fa02813410be9109a0168e04e2ef"} Jan 31 09:41:36 crc kubenswrapper[4992]: E0131 09:41:36.721467 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" podUID="a24dcaba-7ba8-45f1-9d82-6d080be373c8" Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.718217 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" podUID="a24dcaba-7ba8-45f1-9d82-6d080be373c8" Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.718828 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:ea7b72b648a5bde2eebd804c2a5c1608d448a4892176c1b8d000c1eef4bb92b4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" podUID="5db9c6d8-aa24-4750-9fa1-6b961574dddb" Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.718986 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" podUID="a0ff4139-6d2d-4a25-b0de-29a4e48e407f" Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.719025 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" podUID="692b4d55-c298-4489-9381-73e748a0bc5d" Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.719058 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" podUID="99af6e66-0ebf-4bb3-a943-d106e624df65" Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.719091 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" podUID="d8a209a5-ff60-4e77-8745-300b0c1c542a" Jan 31 09:41:37 crc kubenswrapper[4992]: I0131 09:41:37.813123 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:37 
crc kubenswrapper[4992]: E0131 09:41:37.813947 4992 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:37 crc kubenswrapper[4992]: E0131 09:41:37.813987 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert podName:d8bbd2ef-463b-430b-8332-a6f48ea54e75 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:41.81397349 +0000 UTC m=+997.785365477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert") pod "infra-operator-controller-manager-79955696d6-98f7s" (UID: "d8bbd2ef-463b-430b-8332-a6f48ea54e75") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:38 crc kubenswrapper[4992]: I0131 09:41:38.015324 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:38 crc kubenswrapper[4992]: E0131 09:41:38.015570 4992 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:38 crc kubenswrapper[4992]: E0131 09:41:38.015673 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert podName:185c6250-c2c7-4ff3-bfe6-6449f78269f2 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:42.015653627 +0000 UTC m=+997.987045614 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" (UID: "185c6250-c2c7-4ff3-bfe6-6449f78269f2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:38 crc kubenswrapper[4992]: I0131 09:41:38.421457 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:38 crc kubenswrapper[4992]: I0131 09:41:38.421533 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:38 crc kubenswrapper[4992]: E0131 09:41:38.421706 4992 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:41:38 crc kubenswrapper[4992]: E0131 09:41:38.421759 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:42.421743308 +0000 UTC m=+998.393135305 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "webhook-server-cert" not found Jan 31 09:41:38 crc kubenswrapper[4992]: E0131 09:41:38.422127 4992 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:41:38 crc kubenswrapper[4992]: E0131 09:41:38.422158 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:42.42215149 +0000 UTC m=+998.393543477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "metrics-server-cert" not found Jan 31 09:41:41 crc kubenswrapper[4992]: I0131 09:41:41.878367 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:41 crc kubenswrapper[4992]: E0131 09:41:41.878657 4992 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:41 crc kubenswrapper[4992]: E0131 09:41:41.878759 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert 
podName:d8bbd2ef-463b-430b-8332-a6f48ea54e75 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:49.878740027 +0000 UTC m=+1005.850132014 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert") pod "infra-operator-controller-manager-79955696d6-98f7s" (UID: "d8bbd2ef-463b-430b-8332-a6f48ea54e75") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:42 crc kubenswrapper[4992]: I0131 09:41:42.081196 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:42 crc kubenswrapper[4992]: E0131 09:41:42.081458 4992 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:42 crc kubenswrapper[4992]: E0131 09:41:42.081565 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert podName:185c6250-c2c7-4ff3-bfe6-6449f78269f2 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:50.081521295 +0000 UTC m=+1006.052913282 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" (UID: "185c6250-c2c7-4ff3-bfe6-6449f78269f2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:42 crc kubenswrapper[4992]: I0131 09:41:42.494096 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:42 crc kubenswrapper[4992]: I0131 09:41:42.494181 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:42 crc kubenswrapper[4992]: E0131 09:41:42.494372 4992 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:41:42 crc kubenswrapper[4992]: E0131 09:41:42.494447 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:50.494411324 +0000 UTC m=+1006.465803321 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "webhook-server-cert" not found Jan 31 09:41:42 crc kubenswrapper[4992]: E0131 09:41:42.494822 4992 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:41:42 crc kubenswrapper[4992]: E0131 09:41:42.494857 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:41:50.494846787 +0000 UTC m=+1006.466238774 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "metrics-server-cert" not found Jan 31 09:41:45 crc kubenswrapper[4992]: I0131 09:41:45.301561 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:41:45 crc kubenswrapper[4992]: I0131 09:41:45.302699 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:41:47 crc kubenswrapper[4992]: E0131 09:41:47.507037 4992 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/openstack-k8s-operators/test-operator:14e721e93cdf96c517e875d18ced64f1b68ad193" Jan 31 09:41:47 crc kubenswrapper[4992]: E0131 09:41:47.507327 4992 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.147:5001/openstack-k8s-operators/test-operator:14e721e93cdf96c517e875d18ced64f1b68ad193" Jan 31 09:41:47 crc kubenswrapper[4992]: E0131 09:41:47.507489 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.147:5001/openstack-k8s-operators/test-operator:14e721e93cdf96c517e875d18ced64f1b68ad193,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twqnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6944ddd655-mmsrl_openstack-operators(7496dda7-bc5b-4f0f-a93a-176f397fbeca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:41:47 crc kubenswrapper[4992]: E0131 09:41:47.509003 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" podUID="7496dda7-bc5b-4f0f-a93a-176f397fbeca" Jan 31 09:41:47 crc kubenswrapper[4992]: E0131 09:41:47.776028 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.147:5001/openstack-k8s-operators/test-operator:14e721e93cdf96c517e875d18ced64f1b68ad193\\\"\"" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" podUID="7496dda7-bc5b-4f0f-a93a-176f397fbeca" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.781309 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" event={"ID":"029e3183-fa79-4992-a074-61281511f268","Type":"ContainerStarted","Data":"6e74de83ea0dd4110a00bc1e99f7529a0aa4684dc8304fef8bca8c66c7d182d7"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.781664 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.782756 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" event={"ID":"93e6253c-64e6-458c-a400-c9587c015da2","Type":"ContainerStarted","Data":"ef6fd97b26c4834919dfc12d0ccb9189d184498aa86f8bc8af141d0c90172199"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.782906 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.783984 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" event={"ID":"b14e0dd4-9405-4358-802c-631e838b0746","Type":"ContainerStarted","Data":"943fcae51c4a450a977f0a9bb03b464053047cd5dce1c24b777f1b9a1765851a"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.784068 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.785641 4992 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" event={"ID":"429d0324-062a-4afe-88e7-ff0fc725b5fc","Type":"ContainerStarted","Data":"ddcb86ddc4d0ff55ec984a6ef36d294b705c843efefcd312bccb598e7fb7e190"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.785691 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.787186 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" event={"ID":"aefa948a-54b2-4cff-8934-2af2310b73da","Type":"ContainerStarted","Data":"f95631027ca87009832273acf85bc44beddb8b89565405acbac7ddb39f481eed"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.787286 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.788527 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" event={"ID":"c696cc9f-2c26-4f46-8a14-fc28c65d1855","Type":"ContainerStarted","Data":"0bb12db507cafbeb1282b0abe772ad557f534542ab25ee35ac0eab49af01eca0"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.788649 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.789611 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" event={"ID":"0237347b-28b1-42cb-b0c3-3d0cdb660846","Type":"ContainerStarted","Data":"bbcda70ebcec221888d962ca93d83567683f39c843725c80ed89ac817064a03a"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.789701 4992 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.790847 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" event={"ID":"f74c9115-a924-4896-a04d-4f523eadafa7","Type":"ContainerStarted","Data":"a2a0c2228db7bec8d14d63c3b05acf1e9452634ff50672321b80783c780ae8e3"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.791197 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.792328 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" event={"ID":"3a5af5db-f87a-4df8-bb90-81b56a48ec32","Type":"ContainerStarted","Data":"4f9dc32d3cc7f48ade5f1a7504d0a397bba8949eb013e46a22badb00ffecd574"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.792682 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.794815 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" event={"ID":"1052ce64-cfa7-4fd8-be68-849dc5cfc74f","Type":"ContainerStarted","Data":"ab0b197f938a05725cc01a8e5d80010d8cc01fbaab6e5d0a2b6d0f12e5f98c13"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.795189 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.796314 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" event={"ID":"56f8199c-cece-4c37-a1ab-87f1eae9bd83","Type":"ContainerStarted","Data":"01e1531ff55d278109bdce26a512cec1d53e8df2a07daf93644d6739702f2331"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.796660 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.797843 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" event={"ID":"f03e2ba4-3389-499d-866f-8349a5a41b9e","Type":"ContainerStarted","Data":"5f14c35ce70aa1f42288010c1a7e313d8287eaa423e5b63fad77f2de0f6890b3"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.798198 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.799594 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" event={"ID":"8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58","Type":"ContainerStarted","Data":"f806415f99ab5e8670c3bc5b87c4f5f633f1ab260cc04bf390512a91218be3c7"} Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.800199 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.891317 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" podStartSLOduration=2.9211833179999998 podStartE2EDuration="14.891298077s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.044172652 +0000 UTC m=+992.015564639" 
lastFinishedPulling="2026-01-31 09:41:48.014287421 +0000 UTC m=+1003.985679398" observedRunningTime="2026-01-31 09:41:48.819663677 +0000 UTC m=+1004.791055684" watchObservedRunningTime="2026-01-31 09:41:48.891298077 +0000 UTC m=+1004.862690064" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.894365 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" podStartSLOduration=3.252101129 podStartE2EDuration="15.894318504s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.415396814 +0000 UTC m=+991.386788801" lastFinishedPulling="2026-01-31 09:41:48.057614189 +0000 UTC m=+1004.029006176" observedRunningTime="2026-01-31 09:41:48.891933455 +0000 UTC m=+1004.863325442" watchObservedRunningTime="2026-01-31 09:41:48.894318504 +0000 UTC m=+1004.865710491" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.940561 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" podStartSLOduration=3.48213359 podStartE2EDuration="15.940543787s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.574171295 +0000 UTC m=+991.545563282" lastFinishedPulling="2026-01-31 09:41:48.032581492 +0000 UTC m=+1004.003973479" observedRunningTime="2026-01-31 09:41:48.929080014 +0000 UTC m=+1004.900472031" watchObservedRunningTime="2026-01-31 09:41:48.940543787 +0000 UTC m=+1004.911935764" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.988580 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" podStartSLOduration=3.904154683 podStartE2EDuration="15.988566441s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.410527013 +0000 UTC m=+991.381919000" 
lastFinishedPulling="2026-01-31 09:41:47.494938771 +0000 UTC m=+1003.466330758" observedRunningTime="2026-01-31 09:41:48.958674463 +0000 UTC m=+1004.930066470" watchObservedRunningTime="2026-01-31 09:41:48.988566441 +0000 UTC m=+1004.959958428" Jan 31 09:41:48 crc kubenswrapper[4992]: I0131 09:41:48.989311 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" podStartSLOduration=3.9521396859999998 podStartE2EDuration="15.989306352s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.019825725 +0000 UTC m=+991.991217712" lastFinishedPulling="2026-01-31 09:41:48.056992391 +0000 UTC m=+1004.028384378" observedRunningTime="2026-01-31 09:41:48.986873322 +0000 UTC m=+1004.958265309" watchObservedRunningTime="2026-01-31 09:41:48.989306352 +0000 UTC m=+1004.960698329" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.027253 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" podStartSLOduration=3.54622812 podStartE2EDuration="16.027233254s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.54507354 +0000 UTC m=+991.516465527" lastFinishedPulling="2026-01-31 09:41:48.026078684 +0000 UTC m=+1003.997470661" observedRunningTime="2026-01-31 09:41:49.022435604 +0000 UTC m=+1004.993827601" watchObservedRunningTime="2026-01-31 09:41:49.027233254 +0000 UTC m=+1004.998625241" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.053641 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" podStartSLOduration=3.006362481 podStartE2EDuration="15.05362551s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.97866551 +0000 UTC m=+991.950057497" 
lastFinishedPulling="2026-01-31 09:41:48.025928549 +0000 UTC m=+1003.997320526" observedRunningTime="2026-01-31 09:41:49.050115058 +0000 UTC m=+1005.021507055" watchObservedRunningTime="2026-01-31 09:41:49.05362551 +0000 UTC m=+1005.025017497" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.078432 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" podStartSLOduration=3.0885379 podStartE2EDuration="16.078396269s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.024594817 +0000 UTC m=+990.995986804" lastFinishedPulling="2026-01-31 09:41:48.014453166 +0000 UTC m=+1003.985845173" observedRunningTime="2026-01-31 09:41:49.072121157 +0000 UTC m=+1005.043513144" watchObservedRunningTime="2026-01-31 09:41:49.078396269 +0000 UTC m=+1005.049788256" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.133993 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" podStartSLOduration=2.693517247 podStartE2EDuration="15.133974263s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.574758952 +0000 UTC m=+991.546150939" lastFinishedPulling="2026-01-31 09:41:48.015215958 +0000 UTC m=+1003.986607955" observedRunningTime="2026-01-31 09:41:49.09906272 +0000 UTC m=+1005.070454727" watchObservedRunningTime="2026-01-31 09:41:49.133974263 +0000 UTC m=+1005.105366250" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.134360 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" podStartSLOduration=4.110116103 podStartE2EDuration="16.134356474s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.991861733 +0000 UTC m=+991.963253720" 
lastFinishedPulling="2026-01-31 09:41:48.016102104 +0000 UTC m=+1003.987494091" observedRunningTime="2026-01-31 09:41:49.132872741 +0000 UTC m=+1005.104264738" watchObservedRunningTime="2026-01-31 09:41:49.134356474 +0000 UTC m=+1005.105748461" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.157560 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" podStartSLOduration=4.151837524 podStartE2EDuration="16.157535577s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.020282998 +0000 UTC m=+991.991674995" lastFinishedPulling="2026-01-31 09:41:48.025981071 +0000 UTC m=+1003.997373048" observedRunningTime="2026-01-31 09:41:49.151820291 +0000 UTC m=+1005.123212278" watchObservedRunningTime="2026-01-31 09:41:49.157535577 +0000 UTC m=+1005.128927574" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.185509 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" podStartSLOduration=3.743988603 podStartE2EDuration="16.185490409s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:35.573725092 +0000 UTC m=+991.545117079" lastFinishedPulling="2026-01-31 09:41:48.015226888 +0000 UTC m=+1003.986618885" observedRunningTime="2026-01-31 09:41:49.184564512 +0000 UTC m=+1005.155956519" watchObservedRunningTime="2026-01-31 09:41:49.185490409 +0000 UTC m=+1005.156882396" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.207795 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" podStartSLOduration=3.138528079 podStartE2EDuration="15.207778316s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.019161296 +0000 UTC m=+991.990553283" 
lastFinishedPulling="2026-01-31 09:41:48.088411493 +0000 UTC m=+1004.059803520" observedRunningTime="2026-01-31 09:41:49.206494469 +0000 UTC m=+1005.177886476" watchObservedRunningTime="2026-01-31 09:41:49.207778316 +0000 UTC m=+1005.179170303" Jan 31 09:41:49 crc kubenswrapper[4992]: I0131 09:41:49.899712 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:41:49 crc kubenswrapper[4992]: E0131 09:41:49.899794 4992 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:49 crc kubenswrapper[4992]: E0131 09:41:49.899862 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert podName:d8bbd2ef-463b-430b-8332-a6f48ea54e75 nodeName:}" failed. No retries permitted until 2026-01-31 09:42:05.899842542 +0000 UTC m=+1021.871234639 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert") pod "infra-operator-controller-manager-79955696d6-98f7s" (UID: "d8bbd2ef-463b-430b-8332-a6f48ea54e75") : secret "infra-operator-webhook-server-cert" not found Jan 31 09:41:50 crc kubenswrapper[4992]: I0131 09:41:50.102677 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:41:50 crc kubenswrapper[4992]: E0131 09:41:50.102937 4992 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:50 crc kubenswrapper[4992]: E0131 09:41:50.103027 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert podName:185c6250-c2c7-4ff3-bfe6-6449f78269f2 nodeName:}" failed. No retries permitted until 2026-01-31 09:42:06.103004661 +0000 UTC m=+1022.074396728 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" (UID: "185c6250-c2c7-4ff3-bfe6-6449f78269f2") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 31 09:41:50 crc kubenswrapper[4992]: I0131 09:41:50.506294 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:50 crc kubenswrapper[4992]: I0131 09:41:50.506559 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:41:50 crc kubenswrapper[4992]: E0131 09:41:50.506506 4992 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 31 09:41:50 crc kubenswrapper[4992]: E0131 09:41:50.506679 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:42:06.506653071 +0000 UTC m=+1022.478045058 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "metrics-server-cert" not found Jan 31 09:41:50 crc kubenswrapper[4992]: E0131 09:41:50.506713 4992 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 31 09:41:50 crc kubenswrapper[4992]: E0131 09:41:50.506759 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs podName:cbf54a49-6a1c-49ae-80f6-5beee6be7377 nodeName:}" failed. No retries permitted until 2026-01-31 09:42:06.506745514 +0000 UTC m=+1022.478137501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs") pod "openstack-operator-controller-manager-b84f98fd-qrd9s" (UID: "cbf54a49-6a1c-49ae-80f6-5beee6be7377") : secret "webhook-server-cert" not found Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.153936 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-xvpg7" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.172780 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-fjhnl" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.266777 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9qxn8" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.278945 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-69d6db494d-mgfln" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.300965 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-jtscq" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.315864 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-pqvpw" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.422315 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-fgt9z" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.490753 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-qg4c6" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.492705 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-p5bv7" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.493292 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-kqs5z" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.572223 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-js4c9" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.665891 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wxxrr" Jan 31 09:41:54 crc kubenswrapper[4992]: I0131 09:41:54.972722 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-564965969-grr4s" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.878204 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" event={"ID":"99af6e66-0ebf-4bb3-a943-d106e624df65","Type":"ContainerStarted","Data":"461fc84ac05bc5c02baea3b3a8307f6761107ed4dce866ceaaf79db682ed6ac0"} Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.879013 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.881776 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" event={"ID":"692b4d55-c298-4489-9381-73e748a0bc5d","Type":"ContainerStarted","Data":"8bb1232aacf4f2e4fcd5179e3f53f877a2b48080073febe170294d38c28d490b"} Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.882052 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.890510 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" event={"ID":"a0ff4139-6d2d-4a25-b0de-29a4e48e407f","Type":"ContainerStarted","Data":"309703d9b6197191d0107d71d017f1450c64bc55ab385db15083eb8ca3b78cf9"} Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.891652 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.903868 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" 
podStartSLOduration=3.536011314 podStartE2EDuration="25.903842944s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.048118077 +0000 UTC m=+992.019510074" lastFinishedPulling="2026-01-31 09:41:58.415949707 +0000 UTC m=+1014.387341704" observedRunningTime="2026-01-31 09:41:58.899563489 +0000 UTC m=+1014.870955476" watchObservedRunningTime="2026-01-31 09:41:58.903842944 +0000 UTC m=+1014.875234931" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.908758 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" event={"ID":"5db9c6d8-aa24-4750-9fa1-6b961574dddb","Type":"ContainerStarted","Data":"929a3d583d3b3d8431b747d94d5bd223883c5e8d4ce494485de1ea529d632214"} Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.909510 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.922610 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" podStartSLOduration=2.479912505 podStartE2EDuration="24.922592398s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.100344513 +0000 UTC m=+992.071736500" lastFinishedPulling="2026-01-31 09:41:58.543024406 +0000 UTC m=+1014.514416393" observedRunningTime="2026-01-31 09:41:58.920227669 +0000 UTC m=+1014.891619676" watchObservedRunningTime="2026-01-31 09:41:58.922592398 +0000 UTC m=+1014.893984385" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.943062 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" podStartSLOduration=2.502306344 podStartE2EDuration="24.943044052s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" 
firstStartedPulling="2026-01-31 09:41:36.100875848 +0000 UTC m=+992.072267835" lastFinishedPulling="2026-01-31 09:41:58.541613536 +0000 UTC m=+1014.513005543" observedRunningTime="2026-01-31 09:41:58.936297296 +0000 UTC m=+1014.907689293" watchObservedRunningTime="2026-01-31 09:41:58.943044052 +0000 UTC m=+1014.914436039" Jan 31 09:41:58 crc kubenswrapper[4992]: I0131 09:41:58.953099 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" podStartSLOduration=2.506217348 podStartE2EDuration="24.953077173s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.101024403 +0000 UTC m=+992.072416390" lastFinishedPulling="2026-01-31 09:41:58.547884228 +0000 UTC m=+1014.519276215" observedRunningTime="2026-01-31 09:41:58.951211859 +0000 UTC m=+1014.922603856" watchObservedRunningTime="2026-01-31 09:41:58.953077173 +0000 UTC m=+1014.924469160" Jan 31 09:41:59 crc kubenswrapper[4992]: I0131 09:41:59.916348 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" event={"ID":"a24dcaba-7ba8-45f1-9d82-6d080be373c8","Type":"ContainerStarted","Data":"27a2386f39ae873826801f27c2f384e1692003a052e645fcf6d6a82cbebb9d08"} Jan 31 09:41:59 crc kubenswrapper[4992]: I0131 09:41:59.916597 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:41:59 crc kubenswrapper[4992]: I0131 09:41:59.918920 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" event={"ID":"d8a209a5-ff60-4e77-8745-300b0c1c542a","Type":"ContainerStarted","Data":"dabf1947c5dcf2bb9570fc137bb2bd801c49c91c727748218d960b4fd8f79695"} Jan 31 09:41:59 crc kubenswrapper[4992]: I0131 09:41:59.934507 4992 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" podStartSLOduration=3.468145569 podStartE2EDuration="25.93448521s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.101520537 +0000 UTC m=+992.072912524" lastFinishedPulling="2026-01-31 09:41:58.567860178 +0000 UTC m=+1014.539252165" observedRunningTime="2026-01-31 09:41:59.933310216 +0000 UTC m=+1015.904702223" watchObservedRunningTime="2026-01-31 09:41:59.93448521 +0000 UTC m=+1015.905877217" Jan 31 09:41:59 crc kubenswrapper[4992]: I0131 09:41:59.953681 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nszvc" podStartSLOduration=3.416705596 podStartE2EDuration="25.953662717s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.050340121 +0000 UTC m=+992.021732108" lastFinishedPulling="2026-01-31 09:41:58.587297242 +0000 UTC m=+1014.558689229" observedRunningTime="2026-01-31 09:41:59.950634109 +0000 UTC m=+1015.922026116" watchObservedRunningTime="2026-01-31 09:41:59.953662717 +0000 UTC m=+1015.925054704" Jan 31 09:42:04 crc kubenswrapper[4992]: I0131 09:42:04.441526 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w2697" Jan 31 09:42:04 crc kubenswrapper[4992]: I0131 09:42:04.683309 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-2j9gv" Jan 31 09:42:04 crc kubenswrapper[4992]: I0131 09:42:04.712156 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-d56qf" Jan 31 09:42:04 crc kubenswrapper[4992]: I0131 09:42:04.767715 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-4jkf5" Jan 31 09:42:04 crc kubenswrapper[4992]: I0131 09:42:04.802995 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khm8v" Jan 31 09:42:05 crc kubenswrapper[4992]: I0131 09:42:05.966325 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:42:05 crc kubenswrapper[4992]: I0131 09:42:05.972946 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8bbd2ef-463b-430b-8332-a6f48ea54e75-cert\") pod \"infra-operator-controller-manager-79955696d6-98f7s\" (UID: \"d8bbd2ef-463b-430b-8332-a6f48ea54e75\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.169344 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.175104 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/185c6250-c2c7-4ff3-bfe6-6449f78269f2-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4\" (UID: \"185c6250-c2c7-4ff3-bfe6-6449f78269f2\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.185005 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kjhl4" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.193248 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.230037 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zn6k9" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.238276 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.577467 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.577960 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.583255 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-metrics-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.583856 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cbf54a49-6a1c-49ae-80f6-5beee6be7377-webhook-certs\") pod \"openstack-operator-controller-manager-b84f98fd-qrd9s\" (UID: \"cbf54a49-6a1c-49ae-80f6-5beee6be7377\") " pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.613758 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pxx7q" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.622762 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.654035 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-98f7s"] Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.774675 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4"] Jan 31 09:42:06 crc kubenswrapper[4992]: W0131 09:42:06.779615 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod185c6250_c2c7_4ff3_bfe6_6449f78269f2.slice/crio-be1faa3ecf0d0cb31de3cba9641a4887c6c770fcfdecb80d86fb381fb6919d84 WatchSource:0}: Error finding container be1faa3ecf0d0cb31de3cba9641a4887c6c770fcfdecb80d86fb381fb6919d84: Status 404 returned error can't find the container with id be1faa3ecf0d0cb31de3cba9641a4887c6c770fcfdecb80d86fb381fb6919d84 Jan 31 09:42:06 crc kubenswrapper[4992]: W0131 09:42:06.870871 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf54a49_6a1c_49ae_80f6_5beee6be7377.slice/crio-a3bab253cf70c482f22d288260086d71ec2be9f65e852820f9ede4582b4fa9c4 WatchSource:0}: Error finding container a3bab253cf70c482f22d288260086d71ec2be9f65e852820f9ede4582b4fa9c4: Status 404 returned error can't find the container with id a3bab253cf70c482f22d288260086d71ec2be9f65e852820f9ede4582b4fa9c4 Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.870999 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s"] Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.965776 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" 
event={"ID":"cbf54a49-6a1c-49ae-80f6-5beee6be7377","Type":"ContainerStarted","Data":"a3bab253cf70c482f22d288260086d71ec2be9f65e852820f9ede4582b4fa9c4"} Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.966829 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" event={"ID":"185c6250-c2c7-4ff3-bfe6-6449f78269f2","Type":"ContainerStarted","Data":"be1faa3ecf0d0cb31de3cba9641a4887c6c770fcfdecb80d86fb381fb6919d84"} Jan 31 09:42:06 crc kubenswrapper[4992]: I0131 09:42:06.967554 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" event={"ID":"d8bbd2ef-463b-430b-8332-a6f48ea54e75","Type":"ContainerStarted","Data":"c65825554e51d68c1a78b53171c3c4367ac3746408df31babf4ff74db61391ec"} Jan 31 09:42:14 crc kubenswrapper[4992]: I0131 09:42:14.019169 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" event={"ID":"cbf54a49-6a1c-49ae-80f6-5beee6be7377","Type":"ContainerStarted","Data":"f987a1fc0aff7ea37ba527faa2d42065f8197ea9003d34dc1ca9c3253ec11f00"} Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.026317 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.222671 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" podStartSLOduration=41.222649663 podStartE2EDuration="41.222649663s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:42:15.112855015 +0000 UTC m=+1031.084247022" watchObservedRunningTime="2026-01-31 09:42:15.222649663 
+0000 UTC m=+1031.194041650" Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.301730 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.301844 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.301931 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.302664 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56fd2e562c473f9f02a32edbe3694b09ca6daec109306548ace480ef8bb463a3"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:42:15 crc kubenswrapper[4992]: I0131 09:42:15.302739 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://56fd2e562c473f9f02a32edbe3694b09ca6daec109306548ace480ef8bb463a3" gracePeriod=600 Jan 31 09:42:16 crc kubenswrapper[4992]: I0131 09:42:16.040173 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" 
containerID="56fd2e562c473f9f02a32edbe3694b09ca6daec109306548ace480ef8bb463a3" exitCode=0 Jan 31 09:42:16 crc kubenswrapper[4992]: I0131 09:42:16.040261 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"56fd2e562c473f9f02a32edbe3694b09ca6daec109306548ace480ef8bb463a3"} Jan 31 09:42:16 crc kubenswrapper[4992]: I0131 09:42:16.040586 4992 scope.go:117] "RemoveContainer" containerID="dcfc3dfd610126640e5e892b66ddc4eb9fe55b9da0dd6e87ee131f4f08a55e7c" Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.048106 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"eefc220641844057c58f4645845ce2f51a73e101cb77d772da4c569d245be5c5"} Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.050412 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" event={"ID":"185c6250-c2c7-4ff3-bfe6-6449f78269f2","Type":"ContainerStarted","Data":"72a5e6041134217f12e4cfb126c4fb8c620a0ef71028b94ad5b1ef2d465bbd02"} Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.050566 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.051607 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" event={"ID":"d8bbd2ef-463b-430b-8332-a6f48ea54e75","Type":"ContainerStarted","Data":"195ab3cad6b4cba234f18d9d280ac04c7fd84d9f2ecd5dde4ec445eed4198e5d"} Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.051706 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.052790 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" event={"ID":"7496dda7-bc5b-4f0f-a93a-176f397fbeca","Type":"ContainerStarted","Data":"8d89e9983e9e78b068955ad2b09380893d9549df6cec8b47c3bfed5b9c31afca"} Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.053173 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.083204 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" podStartSLOduration=34.874253942 podStartE2EDuration="44.083188716s" podCreationTimestamp="2026-01-31 09:41:33 +0000 UTC" firstStartedPulling="2026-01-31 09:42:06.6605003 +0000 UTC m=+1022.631892307" lastFinishedPulling="2026-01-31 09:42:15.869435094 +0000 UTC m=+1031.840827081" observedRunningTime="2026-01-31 09:42:17.079753787 +0000 UTC m=+1033.051145804" watchObservedRunningTime="2026-01-31 09:42:17.083188716 +0000 UTC m=+1033.054580703" Jan 31 09:42:17 crc kubenswrapper[4992]: I0131 09:42:17.100549 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" podStartSLOduration=3.300191362 podStartE2EDuration="43.10053038s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:41:36.018877518 +0000 UTC m=+991.990269505" lastFinishedPulling="2026-01-31 09:42:15.819216536 +0000 UTC m=+1031.790608523" observedRunningTime="2026-01-31 09:42:17.095665279 +0000 UTC m=+1033.067057276" watchObservedRunningTime="2026-01-31 09:42:17.10053038 +0000 UTC m=+1033.071922367" Jan 31 09:42:17 crc kubenswrapper[4992]: 
I0131 09:42:17.124789 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" podStartSLOduration=34.039183841 podStartE2EDuration="43.124770534s" podCreationTimestamp="2026-01-31 09:41:34 +0000 UTC" firstStartedPulling="2026-01-31 09:42:06.78174573 +0000 UTC m=+1022.753137717" lastFinishedPulling="2026-01-31 09:42:15.867332413 +0000 UTC m=+1031.838724410" observedRunningTime="2026-01-31 09:42:17.121764166 +0000 UTC m=+1033.093156163" watchObservedRunningTime="2026-01-31 09:42:17.124770534 +0000 UTC m=+1033.096162521" Jan 31 09:42:24 crc kubenswrapper[4992]: I0131 09:42:24.812467 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6944ddd655-mmsrl" Jan 31 09:42:26 crc kubenswrapper[4992]: I0131 09:42:26.201824 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-98f7s" Jan 31 09:42:26 crc kubenswrapper[4992]: I0131 09:42:26.246638 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4" Jan 31 09:42:26 crc kubenswrapper[4992]: I0131 09:42:26.631047 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-b84f98fd-qrd9s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.329587 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rwvbz"] Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.331671 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.334053 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.334412 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.335351 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.335407 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r7fnk" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.348442 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rwvbz"] Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.375405 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zc75s"] Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.376832 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.380800 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.397553 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zc75s"] Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.397636 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6pt\" (UniqueName: \"kubernetes.io/projected/39503869-5316-4469-a57a-2fd445548c7e-kube-api-access-5l6pt\") pod \"dnsmasq-dns-675f4bcbfc-rwvbz\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.397725 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39503869-5316-4469-a57a-2fd445548c7e-config\") pod \"dnsmasq-dns-675f4bcbfc-rwvbz\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.498452 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-config\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.498558 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39503869-5316-4469-a57a-2fd445548c7e-config\") pod \"dnsmasq-dns-675f4bcbfc-rwvbz\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc 
kubenswrapper[4992]: I0131 09:42:43.499434 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39503869-5316-4469-a57a-2fd445548c7e-config\") pod \"dnsmasq-dns-675f4bcbfc-rwvbz\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.499507 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6pt\" (UniqueName: \"kubernetes.io/projected/39503869-5316-4469-a57a-2fd445548c7e-kube-api-access-5l6pt\") pod \"dnsmasq-dns-675f4bcbfc-rwvbz\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.499875 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.499949 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqk4t\" (UniqueName: \"kubernetes.io/projected/6bf08a30-77ab-4371-bdaa-fa498b96136c-kube-api-access-zqk4t\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.534008 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6pt\" (UniqueName: \"kubernetes.io/projected/39503869-5316-4469-a57a-2fd445548c7e-kube-api-access-5l6pt\") pod \"dnsmasq-dns-675f4bcbfc-rwvbz\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: 
I0131 09:42:43.600519 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.600946 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqk4t\" (UniqueName: \"kubernetes.io/projected/6bf08a30-77ab-4371-bdaa-fa498b96136c-kube-api-access-zqk4t\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.601038 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-config\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.601272 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.601935 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-config\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.624632 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqk4t\" 
(UniqueName: \"kubernetes.io/projected/6bf08a30-77ab-4371-bdaa-fa498b96136c-kube-api-access-zqk4t\") pod \"dnsmasq-dns-78dd6ddcc-zc75s\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.655607 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:42:43 crc kubenswrapper[4992]: I0131 09:42:43.696667 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:42:44 crc kubenswrapper[4992]: I0131 09:42:44.114770 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rwvbz"] Jan 31 09:42:44 crc kubenswrapper[4992]: I0131 09:42:44.125164 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:42:44 crc kubenswrapper[4992]: I0131 09:42:44.188795 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zc75s"] Jan 31 09:42:44 crc kubenswrapper[4992]: W0131 09:42:44.193916 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf08a30_77ab_4371_bdaa_fa498b96136c.slice/crio-6897ce7cec4c3f44f3d9fb8bad334d055c1f882c4dff09db4273e5bd705dbd60 WatchSource:0}: Error finding container 6897ce7cec4c3f44f3d9fb8bad334d055c1f882c4dff09db4273e5bd705dbd60: Status 404 returned error can't find the container with id 6897ce7cec4c3f44f3d9fb8bad334d055c1f882c4dff09db4273e5bd705dbd60 Jan 31 09:42:44 crc kubenswrapper[4992]: I0131 09:42:44.270842 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" event={"ID":"6bf08a30-77ab-4371-bdaa-fa498b96136c","Type":"ContainerStarted","Data":"6897ce7cec4c3f44f3d9fb8bad334d055c1f882c4dff09db4273e5bd705dbd60"} Jan 31 09:42:44 crc kubenswrapper[4992]: I0131 
09:42:44.272007 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" event={"ID":"39503869-5316-4469-a57a-2fd445548c7e","Type":"ContainerStarted","Data":"2b1b7e8b76f364eaf144baad5c91133334c38f42fcafae72669be7e62951d1d4"} Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.049393 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rwvbz"] Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.080486 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hx8p5"] Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.082005 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.092057 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hx8p5"] Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.141525 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-config\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.143502 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjtx\" (UniqueName: \"kubernetes.io/projected/93164cd8-4706-4c83-b45f-a4a57d931b7d-kube-api-access-brjtx\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.143534 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.246921 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjtx\" (UniqueName: \"kubernetes.io/projected/93164cd8-4706-4c83-b45f-a4a57d931b7d-kube-api-access-brjtx\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.247696 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.248699 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-config\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.249990 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.250287 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-config\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: 
\"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.276131 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjtx\" (UniqueName: \"kubernetes.io/projected/93164cd8-4706-4c83-b45f-a4a57d931b7d-kube-api-access-brjtx\") pod \"dnsmasq-dns-666b6646f7-hx8p5\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.317222 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zc75s"] Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.373017 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-c966m"] Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.384412 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.404166 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-c966m"] Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.415754 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.454705 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.454957 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-config\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.455028 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdq4\" (UniqueName: \"kubernetes.io/projected/c7e2707b-6952-4179-8bc8-5d02c57126af-kube-api-access-tzdq4\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.556495 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-config\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.556840 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdq4\" (UniqueName: \"kubernetes.io/projected/c7e2707b-6952-4179-8bc8-5d02c57126af-kube-api-access-tzdq4\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: 
\"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.556959 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.558061 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.560872 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-config\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.622679 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdq4\" (UniqueName: \"kubernetes.io/projected/c7e2707b-6952-4179-8bc8-5d02c57126af-kube-api-access-tzdq4\") pod \"dnsmasq-dns-57d769cc4f-c966m\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.806719 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:42:46 crc kubenswrapper[4992]: I0131 09:42:46.953350 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hx8p5"] Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.210846 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.212812 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.214767 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.216012 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.216455 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-c966m"] Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.217729 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.218039 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ft75t" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.218096 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.218276 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.218284 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.223775 4992 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.304708 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" event={"ID":"93164cd8-4706-4c83-b45f-a4a57d931b7d","Type":"ContainerStarted","Data":"d363553e65f08ba52654c7b67074f8985e17dabebbc99cf4d213a67a914fa429"} Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365530 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/71b7a97b-2d62-4b05-84f6-fc720ce9c672-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365608 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365639 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365657 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365692 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365717 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365751 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-config-data\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365822 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365893 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/71b7a97b-2d62-4b05-84f6-fc720ce9c672-pod-info\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365912 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-server-conf\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.365936 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8swl\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-kube-api-access-s8swl\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467191 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467232 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467270 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467299 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467335 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-config-data\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467388 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467446 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/71b7a97b-2d62-4b05-84f6-fc720ce9c672-pod-info\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467460 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-server-conf\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467476 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8swl\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-kube-api-access-s8swl\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467499 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/71b7a97b-2d62-4b05-84f6-fc720ce9c672-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467524 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.467762 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.468393 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.468581 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.468912 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.469456 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-server-conf\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.469506 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-config-data\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.476002 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/71b7a97b-2d62-4b05-84f6-fc720ce9c672-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.476043 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/71b7a97b-2d62-4b05-84f6-fc720ce9c672-pod-info\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.476217 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc 
kubenswrapper[4992]: I0131 09:42:47.476561 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.483058 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8swl\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-kube-api-access-s8swl\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.495577 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.537737 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.569546 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.572347 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.576884 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.577198 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9l44f" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.579452 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.579687 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.581175 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.581399 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.583466 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.584780 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672122 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672162 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672194 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8005e2e7-ed00-4af1-be65-12638ce3a9f9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672225 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672240 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672255 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672286 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcvfl\" 
(UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-kube-api-access-hcvfl\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672325 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672344 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672361 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.672383 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8005e2e7-ed00-4af1-be65-12638ce3a9f9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775152 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8005e2e7-ed00-4af1-be65-12638ce3a9f9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775200 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775220 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775251 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8005e2e7-ed00-4af1-be65-12638ce3a9f9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775279 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775297 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775312 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775343 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcvfl\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-kube-api-access-hcvfl\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775383 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775427 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.775443 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.776124 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.776170 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.776205 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.776970 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.777266 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.777595 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.779575 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.789296 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8005e2e7-ed00-4af1-be65-12638ce3a9f9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.790787 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8005e2e7-ed00-4af1-be65-12638ce3a9f9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.792188 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.796266 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcvfl\" (UniqueName: 
\"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-kube-api-access-hcvfl\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.808475 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:47 crc kubenswrapper[4992]: I0131 09:42:47.913485 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.685482 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.690595 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.692320 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.695162 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.698455 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.698879 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.698907 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8xvdr" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.705093 4992 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789248 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdt8v\" (UniqueName: \"kubernetes.io/projected/29ab568d-b0c6-4420-b5fb-d027c4561e2f-kube-api-access-xdt8v\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789325 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789401 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-config-data-default\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789443 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789463 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-kolla-config\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" 
Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789480 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29ab568d-b0c6-4420-b5fb-d027c4561e2f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789527 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29ab568d-b0c6-4420-b5fb-d027c4561e2f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.789562 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ab568d-b0c6-4420-b5fb-d027c4561e2f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893526 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdt8v\" (UniqueName: \"kubernetes.io/projected/29ab568d-b0c6-4420-b5fb-d027c4561e2f-kube-api-access-xdt8v\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893584 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893631 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-config-data-default\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893649 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893668 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-kolla-config\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893686 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29ab568d-b0c6-4420-b5fb-d027c4561e2f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893717 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29ab568d-b0c6-4420-b5fb-d027c4561e2f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.893753 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29ab568d-b0c6-4420-b5fb-d027c4561e2f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.894659 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29ab568d-b0c6-4420-b5fb-d027c4561e2f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.895179 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-kolla-config\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.895338 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.898031 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.898883 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29ab568d-b0c6-4420-b5fb-d027c4561e2f-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.901272 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29ab568d-b0c6-4420-b5fb-d027c4561e2f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.914324 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ab568d-b0c6-4420-b5fb-d027c4561e2f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.918822 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdt8v\" (UniqueName: \"kubernetes.io/projected/29ab568d-b0c6-4420-b5fb-d027c4561e2f-kube-api-access-xdt8v\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:48 crc kubenswrapper[4992]: I0131 09:42:48.933510 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"29ab568d-b0c6-4420-b5fb-d027c4561e2f\") " pod="openstack/openstack-galera-0" Jan 31 09:42:49 crc kubenswrapper[4992]: I0131 09:42:49.038135 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.008925 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.010109 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.012078 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7b7l6" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.012561 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.012629 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.014173 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.054727 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.110750 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71459842-ede4-4fcc-9e61-d884a02b341e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.110844 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.110957 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/71459842-ede4-4fcc-9e61-d884a02b341e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.111066 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.111100 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71459842-ede4-4fcc-9e61-d884a02b341e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.111142 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.111172 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25z7\" (UniqueName: \"kubernetes.io/projected/71459842-ede4-4fcc-9e61-d884a02b341e-kube-api-access-j25z7\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.111195 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.170578 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.171404 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.173657 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-75ctl" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.175757 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.179684 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.189838 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212696 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212742 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71459842-ede4-4fcc-9e61-d884a02b341e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212782 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212818 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j25z7\" (UniqueName: \"kubernetes.io/projected/71459842-ede4-4fcc-9e61-d884a02b341e-kube-api-access-j25z7\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212843 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212915 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71459842-ede4-4fcc-9e61-d884a02b341e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.212965 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.213882 4992 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.213921 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71459842-ede4-4fcc-9e61-d884a02b341e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.214274 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71459842-ede4-4fcc-9e61-d884a02b341e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.214320 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.214378 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.214937 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/71459842-ede4-4fcc-9e61-d884a02b341e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.217785 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71459842-ede4-4fcc-9e61-d884a02b341e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.219257 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71459842-ede4-4fcc-9e61-d884a02b341e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.231846 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25z7\" (UniqueName: \"kubernetes.io/projected/71459842-ede4-4fcc-9e61-d884a02b341e-kube-api-access-j25z7\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.235345 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"71459842-ede4-4fcc-9e61-d884a02b341e\") " pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.315093 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e111be8-674c-4a0a-84b1-4a08fa98391f-kolla-config\") pod \"memcached-0\" (UID: 
\"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.315480 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e111be8-674c-4a0a-84b1-4a08fa98391f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.315513 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e111be8-674c-4a0a-84b1-4a08fa98391f-config-data\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.315565 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqxr\" (UniqueName: \"kubernetes.io/projected/7e111be8-674c-4a0a-84b1-4a08fa98391f-kube-api-access-rrqxr\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.315622 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e111be8-674c-4a0a-84b1-4a08fa98391f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.342383 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.416929 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqxr\" (UniqueName: \"kubernetes.io/projected/7e111be8-674c-4a0a-84b1-4a08fa98391f-kube-api-access-rrqxr\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.416987 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e111be8-674c-4a0a-84b1-4a08fa98391f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.417053 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e111be8-674c-4a0a-84b1-4a08fa98391f-kolla-config\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.417100 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e111be8-674c-4a0a-84b1-4a08fa98391f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.417121 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e111be8-674c-4a0a-84b1-4a08fa98391f-config-data\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.417986 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/7e111be8-674c-4a0a-84b1-4a08fa98391f-config-data\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.417994 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7e111be8-674c-4a0a-84b1-4a08fa98391f-kolla-config\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.421383 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e111be8-674c-4a0a-84b1-4a08fa98391f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.422039 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e111be8-674c-4a0a-84b1-4a08fa98391f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.437405 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqxr\" (UniqueName: \"kubernetes.io/projected/7e111be8-674c-4a0a-84b1-4a08fa98391f-kube-api-access-rrqxr\") pod \"memcached-0\" (UID: \"7e111be8-674c-4a0a-84b1-4a08fa98391f\") " pod="openstack/memcached-0" Jan 31 09:42:50 crc kubenswrapper[4992]: I0131 09:42:50.488318 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.280566 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.281742 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.286207 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xcxzq" Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.296200 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.344023 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkcx\" (UniqueName: \"kubernetes.io/projected/1f9647cf-fe89-42c1-bce1-2075d04e658e-kube-api-access-sbkcx\") pod \"kube-state-metrics-0\" (UID: \"1f9647cf-fe89-42c1-bce1-2075d04e658e\") " pod="openstack/kube-state-metrics-0" Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.445578 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkcx\" (UniqueName: \"kubernetes.io/projected/1f9647cf-fe89-42c1-bce1-2075d04e658e-kube-api-access-sbkcx\") pod \"kube-state-metrics-0\" (UID: \"1f9647cf-fe89-42c1-bce1-2075d04e658e\") " pod="openstack/kube-state-metrics-0" Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.461989 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkcx\" (UniqueName: \"kubernetes.io/projected/1f9647cf-fe89-42c1-bce1-2075d04e658e-kube-api-access-sbkcx\") pod \"kube-state-metrics-0\" (UID: \"1f9647cf-fe89-42c1-bce1-2075d04e658e\") " pod="openstack/kube-state-metrics-0" Jan 31 09:42:52 crc kubenswrapper[4992]: I0131 09:42:52.601598 4992 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:42:54 crc kubenswrapper[4992]: I0131 09:42:54.347744 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" event={"ID":"c7e2707b-6952-4179-8bc8-5d02c57126af","Type":"ContainerStarted","Data":"df6572e2dcb41489a3e45b061ce21c4a6b2b656d4320ffcd65ac71cac20286af"} Jan 31 09:42:55 crc kubenswrapper[4992]: I0131 09:42:55.473709 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.091138 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vxtkq"] Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.099315 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.102992 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.103067 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xlw6m" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.103115 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.123428 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-l45p8"] Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.125100 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.139626 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vxtkq"] Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.148578 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l45p8"] Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225148 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-lib\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225239 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnhc\" (UniqueName: \"kubernetes.io/projected/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-kube-api-access-9rnhc\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225319 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-run-ovn\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225343 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-combined-ca-bundle\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: 
I0131 09:42:56.225400 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-run\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225461 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5e4e817-6532-4d1d-85f8-649dae6babac-scripts\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225493 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-log\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225624 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-etc-ovs\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225654 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-run\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225698 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdw7v\" (UniqueName: \"kubernetes.io/projected/d5e4e817-6532-4d1d-85f8-649dae6babac-kube-api-access-fdw7v\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225722 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-log-ovn\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225785 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-scripts\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.225820 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-ovn-controller-tls-certs\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328044 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-run\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328124 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/d5e4e817-6532-4d1d-85f8-649dae6babac-scripts\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328181 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-log\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328224 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-etc-ovs\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328254 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-run\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328272 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdw7v\" (UniqueName: \"kubernetes.io/projected/d5e4e817-6532-4d1d-85f8-649dae6babac-kube-api-access-fdw7v\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328294 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-log-ovn\") pod \"ovn-controller-vxtkq\" 
(UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328347 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-scripts\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328377 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-ovn-controller-tls-certs\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328413 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-lib\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328502 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rnhc\" (UniqueName: \"kubernetes.io/projected/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-kube-api-access-9rnhc\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.328901 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-run-ovn\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: 
I0131 09:42:56.329004 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-combined-ca-bundle\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.331354 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-etc-ovs\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.331758 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-run-ovn\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.331773 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-lib\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.331860 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-run\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.331935 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-run\") 
pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.332165 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d5e4e817-6532-4d1d-85f8-649dae6babac-var-log\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.332348 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-var-log-ovn\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.332438 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5e4e817-6532-4d1d-85f8-649dae6babac-scripts\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.334803 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-scripts\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.341929 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-ovn-controller-tls-certs\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.341931 
4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-combined-ca-bundle\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.353741 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdw7v\" (UniqueName: \"kubernetes.io/projected/d5e4e817-6532-4d1d-85f8-649dae6babac-kube-api-access-fdw7v\") pod \"ovn-controller-ovs-l45p8\" (UID: \"d5e4e817-6532-4d1d-85f8-649dae6babac\") " pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.365180 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnhc\" (UniqueName: \"kubernetes.io/projected/9aca01f7-a9ff-4d25-a330-c505e93a3cd0-kube-api-access-9rnhc\") pod \"ovn-controller-vxtkq\" (UID: \"9aca01f7-a9ff-4d25-a330-c505e93a3cd0\") " pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.424167 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vxtkq" Jan 31 09:42:56 crc kubenswrapper[4992]: I0131 09:42:56.450745 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.599119 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.600644 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.602592 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7hr9w" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.603129 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.603650 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.603770 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.604980 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.615406 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.759713 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5mst\" (UniqueName: \"kubernetes.io/projected/a6c51daf-48d3-4c48-b615-83aede3b27fa-kube-api-access-j5mst\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.759768 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.759809 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/a6c51daf-48d3-4c48-b615-83aede3b27fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.759841 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.759927 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c51daf-48d3-4c48-b615-83aede3b27fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.759953 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.760089 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.760283 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/a6c51daf-48d3-4c48-b615-83aede3b27fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: W0131 09:42:57.820431 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b7a97b_2d62_4b05_84f6_fc720ce9c672.slice/crio-ee64adbfda205b91b125ea43c91390d00e39c2c74180d500c7527cb06988dc53 WatchSource:0}: Error finding container ee64adbfda205b91b125ea43c91390d00e39c2c74180d500c7527cb06988dc53: Status 404 returned error can't find the container with id ee64adbfda205b91b125ea43c91390d00e39c2c74180d500c7527cb06988dc53 Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862204 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5mst\" (UniqueName: \"kubernetes.io/projected/a6c51daf-48d3-4c48-b615-83aede3b27fa-kube-api-access-j5mst\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862251 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862275 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6c51daf-48d3-4c48-b615-83aede3b27fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862296 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862336 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c51daf-48d3-4c48-b615-83aede3b27fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862352 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862387 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.862636 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6c51daf-48d3-4c48-b615-83aede3b27fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.863012 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") device mount path 
\"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.864361 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6c51daf-48d3-4c48-b615-83aede3b27fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.864400 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a6c51daf-48d3-4c48-b615-83aede3b27fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.865273 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c51daf-48d3-4c48-b615-83aede3b27fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.870222 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.872181 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.874757 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a6c51daf-48d3-4c48-b615-83aede3b27fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.881751 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5mst\" (UniqueName: \"kubernetes.io/projected/a6c51daf-48d3-4c48-b615-83aede3b27fa-kube-api-access-j5mst\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.897868 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a6c51daf-48d3-4c48-b615-83aede3b27fa\") " pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:57 crc kubenswrapper[4992]: I0131 09:42:57.941688 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.391527 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"71b7a97b-2d62-4b05-84f6-fc720ce9c672","Type":"ContainerStarted","Data":"ee64adbfda205b91b125ea43c91390d00e39c2c74180d500c7527cb06988dc53"} Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.951790 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.953044 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.955307 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.955807 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.955883 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.957066 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-pthnz" Jan 31 09:42:58 crc kubenswrapper[4992]: I0131 09:42:58.962048 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094644 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d46da54e-1322-4b24-90f2-0929ae4711c2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094723 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094757 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094785 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094845 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094870 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d46da54e-1322-4b24-90f2-0929ae4711c2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094889 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46da54e-1322-4b24-90f2-0929ae4711c2-config\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.094918 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrpr\" (UniqueName: \"kubernetes.io/projected/d46da54e-1322-4b24-90f2-0929ae4711c2-kube-api-access-kjrpr\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " 
pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196110 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196153 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d46da54e-1322-4b24-90f2-0929ae4711c2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196168 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46da54e-1322-4b24-90f2-0929ae4711c2-config\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196200 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrpr\" (UniqueName: \"kubernetes.io/projected/d46da54e-1322-4b24-90f2-0929ae4711c2-kube-api-access-kjrpr\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196250 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d46da54e-1322-4b24-90f2-0929ae4711c2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196302 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196342 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196370 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.196877 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d46da54e-1322-4b24-90f2-0929ae4711c2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.197149 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.198204 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d46da54e-1322-4b24-90f2-0929ae4711c2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") 
" pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.198801 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46da54e-1322-4b24-90f2-0929ae4711c2-config\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.201831 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.214491 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.215823 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.220133 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjrpr\" (UniqueName: \"kubernetes.io/projected/d46da54e-1322-4b24-90f2-0929ae4711c2-kube-api-access-kjrpr\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.224472 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d46da54e-1322-4b24-90f2-0929ae4711c2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d46da54e-1322-4b24-90f2-0929ae4711c2\") " pod="openstack/ovsdbserver-sb-0" Jan 31 09:42:59 crc kubenswrapper[4992]: I0131 09:42:59.271251 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 31 09:43:03 crc kubenswrapper[4992]: E0131 09:43:03.135069 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 31 09:43:03 crc kubenswrapper[4992]: E0131 09:43:03.135292 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqk4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zc75s_openstack(6bf08a30-77ab-4371-bdaa-fa498b96136c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:43:03 crc kubenswrapper[4992]: E0131 09:43:03.136486 4992 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" podUID="6bf08a30-77ab-4371-bdaa-fa498b96136c" Jan 31 09:43:03 crc kubenswrapper[4992]: E0131 09:43:03.200068 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 31 09:43:03 crc kubenswrapper[4992]: E0131 09:43:03.200191 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5l6pt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rwvbz_openstack(39503869-5316-4469-a57a-2fd445548c7e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:43:03 crc kubenswrapper[4992]: E0131 09:43:03.202013 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" podUID="39503869-5316-4469-a57a-2fd445548c7e" Jan 31 09:43:03 crc kubenswrapper[4992]: I0131 09:43:03.549748 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 31 09:43:03 crc kubenswrapper[4992]: W0131 09:43:03.629555 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e111be8_674c_4a0a_84b1_4a08fa98391f.slice/crio-04511ad7a4645e1343f6496a7ee57a1fa17c8aedb9f8450517cdbeec125af8e6 WatchSource:0}: Error finding container 04511ad7a4645e1343f6496a7ee57a1fa17c8aedb9f8450517cdbeec125af8e6: Status 404 returned error can't find the container with id 04511ad7a4645e1343f6496a7ee57a1fa17c8aedb9f8450517cdbeec125af8e6 Jan 31 09:43:03 crc kubenswrapper[4992]: I0131 09:43:03.871434 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:43:03 crc kubenswrapper[4992]: I0131 09:43:03.876936 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 31 09:43:03 crc kubenswrapper[4992]: W0131 09:43:03.895605 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8005e2e7_ed00_4af1_be65_12638ce3a9f9.slice/crio-d35359be5ec36ab778e90725b0cac71425cab881f43b8bbbc3a86ca2f43ace88 WatchSource:0}: Error finding container d35359be5ec36ab778e90725b0cac71425cab881f43b8bbbc3a86ca2f43ace88: Status 404 returned error can't find the container with id d35359be5ec36ab778e90725b0cac71425cab881f43b8bbbc3a86ca2f43ace88 Jan 31 09:43:03 crc kubenswrapper[4992]: W0131 09:43:03.904665 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71459842_ede4_4fcc_9e61_d884a02b341e.slice/crio-06d03a20a55a88dc49e6c5bf89432e99a0f22f0d79c52435fb80a75c67018ceb WatchSource:0}: Error finding container 
06d03a20a55a88dc49e6c5bf89432e99a0f22f0d79c52435fb80a75c67018ceb: Status 404 returned error can't find the container with id 06d03a20a55a88dc49e6c5bf89432e99a0f22f0d79c52435fb80a75c67018ceb Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.023648 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.028364 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.164545 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vxtkq"] Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.210303 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6pt\" (UniqueName: \"kubernetes.io/projected/39503869-5316-4469-a57a-2fd445548c7e-kube-api-access-5l6pt\") pod \"39503869-5316-4469-a57a-2fd445548c7e\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.210670 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-config\") pod \"6bf08a30-77ab-4371-bdaa-fa498b96136c\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.210697 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39503869-5316-4469-a57a-2fd445548c7e-config\") pod \"39503869-5316-4469-a57a-2fd445548c7e\" (UID: \"39503869-5316-4469-a57a-2fd445548c7e\") " Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.210800 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqk4t\" (UniqueName: 
\"kubernetes.io/projected/6bf08a30-77ab-4371-bdaa-fa498b96136c-kube-api-access-zqk4t\") pod \"6bf08a30-77ab-4371-bdaa-fa498b96136c\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.210836 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-dns-svc\") pod \"6bf08a30-77ab-4371-bdaa-fa498b96136c\" (UID: \"6bf08a30-77ab-4371-bdaa-fa498b96136c\") " Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.211966 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bf08a30-77ab-4371-bdaa-fa498b96136c" (UID: "6bf08a30-77ab-4371-bdaa-fa498b96136c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.212379 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-config" (OuterVolumeSpecName: "config") pod "6bf08a30-77ab-4371-bdaa-fa498b96136c" (UID: "6bf08a30-77ab-4371-bdaa-fa498b96136c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.212779 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39503869-5316-4469-a57a-2fd445548c7e-config" (OuterVolumeSpecName: "config") pod "39503869-5316-4469-a57a-2fd445548c7e" (UID: "39503869-5316-4469-a57a-2fd445548c7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.212830 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.219610 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf08a30-77ab-4371-bdaa-fa498b96136c-kube-api-access-zqk4t" (OuterVolumeSpecName: "kube-api-access-zqk4t") pod "6bf08a30-77ab-4371-bdaa-fa498b96136c" (UID: "6bf08a30-77ab-4371-bdaa-fa498b96136c"). InnerVolumeSpecName "kube-api-access-zqk4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.237587 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39503869-5316-4469-a57a-2fd445548c7e-kube-api-access-5l6pt" (OuterVolumeSpecName: "kube-api-access-5l6pt") pod "39503869-5316-4469-a57a-2fd445548c7e" (UID: "39503869-5316-4469-a57a-2fd445548c7e"). InnerVolumeSpecName "kube-api-access-5l6pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.263022 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.314071 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqk4t\" (UniqueName: \"kubernetes.io/projected/6bf08a30-77ab-4371-bdaa-fa498b96136c-kube-api-access-zqk4t\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.314113 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.314124 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6pt\" (UniqueName: \"kubernetes.io/projected/39503869-5316-4469-a57a-2fd445548c7e-kube-api-access-5l6pt\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.314133 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf08a30-77ab-4371-bdaa-fa498b96136c-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.314144 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39503869-5316-4469-a57a-2fd445548c7e-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.377568 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l45p8"] Jan 31 09:43:04 crc kubenswrapper[4992]: W0131 09:43:04.386525 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e4e817_6532_4d1d_85f8_649dae6babac.slice/crio-e1afd6309ce99d9afc85913774b88efe18565ede409755089dba2c993016bc08 WatchSource:0}: Error finding container e1afd6309ce99d9afc85913774b88efe18565ede409755089dba2c993016bc08: Status 404 returned error can't find the container with id e1afd6309ce99d9afc85913774b88efe18565ede409755089dba2c993016bc08 Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.431835 4992 generic.go:334] "Generic (PLEG): container finished" podID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerID="ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24" exitCode=0 Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.431901 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" event={"ID":"c7e2707b-6952-4179-8bc8-5d02c57126af","Type":"ContainerDied","Data":"ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.433701 4992 generic.go:334] "Generic (PLEG): container finished" podID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerID="e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315" exitCode=0 Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.433754 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" event={"ID":"93164cd8-4706-4c83-b45f-a4a57d931b7d","Type":"ContainerDied","Data":"e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.436614 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" event={"ID":"6bf08a30-77ab-4371-bdaa-fa498b96136c","Type":"ContainerDied","Data":"6897ce7cec4c3f44f3d9fb8bad334d055c1f882c4dff09db4273e5bd705dbd60"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.436685 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zc75s" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.438007 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1f9647cf-fe89-42c1-bce1-2075d04e658e","Type":"ContainerStarted","Data":"ed72c179099a2218c009f4b2ec941672ba9eb7e2773005062a0b58c391757397"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.439492 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"29ab568d-b0c6-4420-b5fb-d027c4561e2f","Type":"ContainerStarted","Data":"699a72d97662c849739e14d1509b6fdb6eaca7632916e02717b53697054eac1d"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.441678 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" event={"ID":"39503869-5316-4469-a57a-2fd445548c7e","Type":"ContainerDied","Data":"2b1b7e8b76f364eaf144baad5c91133334c38f42fcafae72669be7e62951d1d4"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.441718 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rwvbz" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.442931 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7e111be8-674c-4a0a-84b1-4a08fa98391f","Type":"ContainerStarted","Data":"04511ad7a4645e1343f6496a7ee57a1fa17c8aedb9f8450517cdbeec125af8e6"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.443974 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8005e2e7-ed00-4af1-be65-12638ce3a9f9","Type":"ContainerStarted","Data":"d35359be5ec36ab778e90725b0cac71425cab881f43b8bbbc3a86ca2f43ace88"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.445799 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71459842-ede4-4fcc-9e61-d884a02b341e","Type":"ContainerStarted","Data":"06d03a20a55a88dc49e6c5bf89432e99a0f22f0d79c52435fb80a75c67018ceb"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.449754 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l45p8" event={"ID":"d5e4e817-6532-4d1d-85f8-649dae6babac","Type":"ContainerStarted","Data":"e1afd6309ce99d9afc85913774b88efe18565ede409755089dba2c993016bc08"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.459305 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vxtkq" event={"ID":"9aca01f7-a9ff-4d25-a330-c505e93a3cd0","Type":"ContainerStarted","Data":"9a16ac78a56d5bee9f91c26f8397d21e8103d61c04756407fc02d1f761b4e797"} Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.522540 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zc75s"] Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.567141 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zc75s"] Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.581876 
4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rwvbz"] Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.585959 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rwvbz"] Jan 31 09:43:04 crc kubenswrapper[4992]: E0131 09:43:04.700199 4992 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 31 09:43:04 crc kubenswrapper[4992]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/93164cd8-4706-4c83-b45f-a4a57d931b7d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 31 09:43:04 crc kubenswrapper[4992]: > podSandboxID="d363553e65f08ba52654c7b67074f8985e17dabebbc99cf4d213a67a914fa429" Jan 31 09:43:04 crc kubenswrapper[4992]: E0131 09:43:04.700691 4992 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 31 09:43:04 crc kubenswrapper[4992]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-brjtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-hx8p5_openstack(93164cd8-4706-4c83-b45f-a4a57d931b7d): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/93164cd8-4706-4c83-b45f-a4a57d931b7d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 31 09:43:04 crc kubenswrapper[4992]: > logger="UnhandledError" Jan 31 09:43:04 crc kubenswrapper[4992]: E0131 09:43:04.702578 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/93164cd8-4706-4c83-b45f-a4a57d931b7d/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" Jan 31 09:43:04 crc kubenswrapper[4992]: I0131 09:43:04.951190 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 31 09:43:05 crc kubenswrapper[4992]: I0131 09:43:05.208873 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39503869-5316-4469-a57a-2fd445548c7e" 
path="/var/lib/kubelet/pods/39503869-5316-4469-a57a-2fd445548c7e/volumes" Jan 31 09:43:05 crc kubenswrapper[4992]: I0131 09:43:05.209305 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf08a30-77ab-4371-bdaa-fa498b96136c" path="/var/lib/kubelet/pods/6bf08a30-77ab-4371-bdaa-fa498b96136c/volumes" Jan 31 09:43:05 crc kubenswrapper[4992]: I0131 09:43:05.247180 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 31 09:43:05 crc kubenswrapper[4992]: I0131 09:43:05.474643 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" event={"ID":"c7e2707b-6952-4179-8bc8-5d02c57126af","Type":"ContainerStarted","Data":"5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7"} Jan 31 09:43:05 crc kubenswrapper[4992]: I0131 09:43:05.510662 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" podStartSLOduration=9.711708257 podStartE2EDuration="19.510647053s" podCreationTimestamp="2026-01-31 09:42:46 +0000 UTC" firstStartedPulling="2026-01-31 09:42:53.504881538 +0000 UTC m=+1069.476273525" lastFinishedPulling="2026-01-31 09:43:03.303820334 +0000 UTC m=+1079.275212321" observedRunningTime="2026-01-31 09:43:05.504178835 +0000 UTC m=+1081.475570842" watchObservedRunningTime="2026-01-31 09:43:05.510647053 +0000 UTC m=+1081.482039040" Jan 31 09:43:06 crc kubenswrapper[4992]: W0131 09:43:06.121706 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c51daf_48d3_4c48_b615_83aede3b27fa.slice/crio-50ba770784b1a2e4331e17a7be67cf867d7a58d6e4d7702e2421c225509c02fb WatchSource:0}: Error finding container 50ba770784b1a2e4331e17a7be67cf867d7a58d6e4d7702e2421c225509c02fb: Status 404 returned error can't find the container with id 50ba770784b1a2e4331e17a7be67cf867d7a58d6e4d7702e2421c225509c02fb Jan 31 09:43:06 crc kubenswrapper[4992]: 
I0131 09:43:06.482232 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a6c51daf-48d3-4c48-b615-83aede3b27fa","Type":"ContainerStarted","Data":"50ba770784b1a2e4331e17a7be67cf867d7a58d6e4d7702e2421c225509c02fb"} Jan 31 09:43:06 crc kubenswrapper[4992]: I0131 09:43:06.482495 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:43:06 crc kubenswrapper[4992]: W0131 09:43:06.735214 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd46da54e_1322_4b24_90f2_0929ae4711c2.slice/crio-58d7daba8c780954385a85c95faf34e58087e551f56bae2d2afaddd570176162 WatchSource:0}: Error finding container 58d7daba8c780954385a85c95faf34e58087e551f56bae2d2afaddd570176162: Status 404 returned error can't find the container with id 58d7daba8c780954385a85c95faf34e58087e551f56bae2d2afaddd570176162 Jan 31 09:43:07 crc kubenswrapper[4992]: I0131 09:43:07.488443 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d46da54e-1322-4b24-90f2-0929ae4711c2","Type":"ContainerStarted","Data":"58d7daba8c780954385a85c95faf34e58087e551f56bae2d2afaddd570176162"} Jan 31 09:43:11 crc kubenswrapper[4992]: I0131 09:43:11.521927 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" event={"ID":"93164cd8-4706-4c83-b45f-a4a57d931b7d","Type":"ContainerStarted","Data":"01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb"} Jan 31 09:43:11 crc kubenswrapper[4992]: I0131 09:43:11.522738 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:43:11 crc kubenswrapper[4992]: I0131 09:43:11.544872 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" podStartSLOduration=9.217771223 
podStartE2EDuration="25.544856754s" podCreationTimestamp="2026-01-31 09:42:46 +0000 UTC" firstStartedPulling="2026-01-31 09:42:46.973675905 +0000 UTC m=+1062.945067892" lastFinishedPulling="2026-01-31 09:43:03.300761436 +0000 UTC m=+1079.272153423" observedRunningTime="2026-01-31 09:43:11.540673013 +0000 UTC m=+1087.512065010" watchObservedRunningTime="2026-01-31 09:43:11.544856754 +0000 UTC m=+1087.516248731" Jan 31 09:43:11 crc kubenswrapper[4992]: I0131 09:43:11.810627 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:43:11 crc kubenswrapper[4992]: I0131 09:43:11.856102 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hx8p5"] Jan 31 09:43:12 crc kubenswrapper[4992]: I0131 09:43:12.531090 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"29ab568d-b0c6-4420-b5fb-d027c4561e2f","Type":"ContainerStarted","Data":"561f1a9461caaf2a562d2515a983d8d8040640a5abfd20c94debf62a980044ef"} Jan 31 09:43:12 crc kubenswrapper[4992]: I0131 09:43:12.536283 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7e111be8-674c-4a0a-84b1-4a08fa98391f","Type":"ContainerStarted","Data":"fb5507ce4d27bb0b5b60ae402d4f16d1f77e1d91ea3d43b48716c5155c0a02cf"} Jan 31 09:43:12 crc kubenswrapper[4992]: I0131 09:43:12.572743 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.15934588 podStartE2EDuration="22.57272389s" podCreationTimestamp="2026-01-31 09:42:50 +0000 UTC" firstStartedPulling="2026-01-31 09:43:03.633065565 +0000 UTC m=+1079.604457552" lastFinishedPulling="2026-01-31 09:43:10.046443575 +0000 UTC m=+1086.017835562" observedRunningTime="2026-01-31 09:43:12.572068611 +0000 UTC m=+1088.543460628" watchObservedRunningTime="2026-01-31 09:43:12.57272389 +0000 UTC m=+1088.544115877" Jan 31 09:43:13 crc 
kubenswrapper[4992]: I0131 09:43:13.544647 4992 generic.go:334] "Generic (PLEG): container finished" podID="d5e4e817-6532-4d1d-85f8-649dae6babac" containerID="d049cb0129733245f8c8191a4fed84b2202a6f858ce4923865b2db5b1c23a1c5" exitCode=0 Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.544768 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l45p8" event={"ID":"d5e4e817-6532-4d1d-85f8-649dae6babac","Type":"ContainerDied","Data":"d049cb0129733245f8c8191a4fed84b2202a6f858ce4923865b2db5b1c23a1c5"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.546598 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vxtkq" event={"ID":"9aca01f7-a9ff-4d25-a330-c505e93a3cd0","Type":"ContainerStarted","Data":"0545c303ea0c06efc8d2ac561f6319100aa34bdff6308280b9f91fc65b89bb53"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.546903 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vxtkq" Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.547942 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d46da54e-1322-4b24-90f2-0929ae4711c2","Type":"ContainerStarted","Data":"cfba9af7def5a20acd69006b767e037acdb283f2051e4538ee526c8bb90bfb8d"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.550718 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8005e2e7-ed00-4af1-be65-12638ce3a9f9","Type":"ContainerStarted","Data":"7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.552569 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1f9647cf-fe89-42c1-bce1-2075d04e658e","Type":"ContainerStarted","Data":"98d32dbd53451f5d135262e1c0b2bd6cc0e8e0df1b42419bb8a5799e1aec476e"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 
09:43:13.552693 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.554701 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"71b7a97b-2d62-4b05-84f6-fc720ce9c672","Type":"ContainerStarted","Data":"d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.556531 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a6c51daf-48d3-4c48-b615-83aede3b27fa","Type":"ContainerStarted","Data":"b6701d932d7dfb1ea0cfef947fd029215b2bbe0252496f936d7d4c8566b11d85"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.560928 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71459842-ede4-4fcc-9e61-d884a02b341e","Type":"ContainerStarted","Data":"74f8585d6b0cd23350f635365479817dad93eb5e94252a7527829c62f840f234"} Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.561191 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerName="dnsmasq-dns" containerID="cri-o://01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb" gracePeriod=10 Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.561261 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.591079 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vxtkq" podStartSLOduration=10.576597714 podStartE2EDuration="17.591062678s" podCreationTimestamp="2026-01-31 09:42:56 +0000 UTC" firstStartedPulling="2026-01-31 09:43:04.177633287 +0000 UTC m=+1080.149025284" lastFinishedPulling="2026-01-31 09:43:11.192098251 +0000 UTC 
m=+1087.163490248" observedRunningTime="2026-01-31 09:43:13.583823058 +0000 UTC m=+1089.555215045" watchObservedRunningTime="2026-01-31 09:43:13.591062678 +0000 UTC m=+1089.562454665" Jan 31 09:43:13 crc kubenswrapper[4992]: I0131 09:43:13.665181 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.630279677 podStartE2EDuration="21.66516259s" podCreationTimestamp="2026-01-31 09:42:52 +0000 UTC" firstStartedPulling="2026-01-31 09:43:04.177132003 +0000 UTC m=+1080.148523990" lastFinishedPulling="2026-01-31 09:43:12.212014916 +0000 UTC m=+1088.183406903" observedRunningTime="2026-01-31 09:43:13.63210295 +0000 UTC m=+1089.603494937" watchObservedRunningTime="2026-01-31 09:43:13.66516259 +0000 UTC m=+1089.636554577" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.445169 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.570714 4992 generic.go:334] "Generic (PLEG): container finished" podID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerID="01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb" exitCode=0 Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.570772 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" event={"ID":"93164cd8-4706-4c83-b45f-a4a57d931b7d","Type":"ContainerDied","Data":"01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb"} Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.571121 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" event={"ID":"93164cd8-4706-4c83-b45f-a4a57d931b7d","Type":"ContainerDied","Data":"d363553e65f08ba52654c7b67074f8985e17dabebbc99cf4d213a67a914fa429"} Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.570787 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hx8p5" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.571160 4992 scope.go:117] "RemoveContainer" containerID="01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.576067 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l45p8" event={"ID":"d5e4e817-6532-4d1d-85f8-649dae6babac","Type":"ContainerStarted","Data":"f73aba50d53a0b88a4f104a36748abf77169d7c35c04b10ed739d33a925229fd"} Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.586324 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-config\") pod \"93164cd8-4706-4c83-b45f-a4a57d931b7d\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.586519 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brjtx\" (UniqueName: \"kubernetes.io/projected/93164cd8-4706-4c83-b45f-a4a57d931b7d-kube-api-access-brjtx\") pod \"93164cd8-4706-4c83-b45f-a4a57d931b7d\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.586565 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-dns-svc\") pod \"93164cd8-4706-4c83-b45f-a4a57d931b7d\" (UID: \"93164cd8-4706-4c83-b45f-a4a57d931b7d\") " Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.588990 4992 scope.go:117] "RemoveContainer" containerID="e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.590686 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/93164cd8-4706-4c83-b45f-a4a57d931b7d-kube-api-access-brjtx" (OuterVolumeSpecName: "kube-api-access-brjtx") pod "93164cd8-4706-4c83-b45f-a4a57d931b7d" (UID: "93164cd8-4706-4c83-b45f-a4a57d931b7d"). InnerVolumeSpecName "kube-api-access-brjtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.609069 4992 scope.go:117] "RemoveContainer" containerID="01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb" Jan 31 09:43:14 crc kubenswrapper[4992]: E0131 09:43:14.609799 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb\": container with ID starting with 01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb not found: ID does not exist" containerID="01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.609843 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb"} err="failed to get container status \"01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb\": rpc error: code = NotFound desc = could not find container \"01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb\": container with ID starting with 01d624a9851d10334508abe5cd6bedcebc87d211a8c8d5ceff89349a41bdf7cb not found: ID does not exist" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.609869 4992 scope.go:117] "RemoveContainer" containerID="e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315" Jan 31 09:43:14 crc kubenswrapper[4992]: E0131 09:43:14.610165 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315\": 
container with ID starting with e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315 not found: ID does not exist" containerID="e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.610207 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315"} err="failed to get container status \"e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315\": rpc error: code = NotFound desc = could not find container \"e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315\": container with ID starting with e2c33fe90e390830dbef2664b2e24e5acb1b932208ec9a8f7b2259740a66b315 not found: ID does not exist" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.622324 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93164cd8-4706-4c83-b45f-a4a57d931b7d" (UID: "93164cd8-4706-4c83-b45f-a4a57d931b7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.625527 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-config" (OuterVolumeSpecName: "config") pod "93164cd8-4706-4c83-b45f-a4a57d931b7d" (UID: "93164cd8-4706-4c83-b45f-a4a57d931b7d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.690010 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brjtx\" (UniqueName: \"kubernetes.io/projected/93164cd8-4706-4c83-b45f-a4a57d931b7d-kube-api-access-brjtx\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.690053 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.690069 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93164cd8-4706-4c83-b45f-a4a57d931b7d-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.900299 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hx8p5"] Jan 31 09:43:14 crc kubenswrapper[4992]: I0131 09:43:14.906867 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hx8p5"] Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.192963 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" path="/var/lib/kubelet/pods/93164cd8-4706-4c83-b45f-a4a57d931b7d/volumes" Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.603295 4992 generic.go:334] "Generic (PLEG): container finished" podID="29ab568d-b0c6-4420-b5fb-d027c4561e2f" containerID="561f1a9461caaf2a562d2515a983d8d8040640a5abfd20c94debf62a980044ef" exitCode=0 Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.603516 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"29ab568d-b0c6-4420-b5fb-d027c4561e2f","Type":"ContainerDied","Data":"561f1a9461caaf2a562d2515a983d8d8040640a5abfd20c94debf62a980044ef"} Jan 31 09:43:15 crc 
kubenswrapper[4992]: I0131 09:43:15.613057 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a6c51daf-48d3-4c48-b615-83aede3b27fa","Type":"ContainerStarted","Data":"e2817a47fa2101efb5469d797934fde936b7a643f92bcadad139c7e329daca55"} Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.617588 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l45p8" event={"ID":"d5e4e817-6532-4d1d-85f8-649dae6babac","Type":"ContainerStarted","Data":"9c5df8fb6f6e7fe04e1bbadeefd2a0a3838810af4e1c00f9878b2cb17076f58f"} Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.617977 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.618221 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.630178 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d46da54e-1322-4b24-90f2-0929ae4711c2","Type":"ContainerStarted","Data":"ca65c84b3c9deed1142655586967ad3a87467cb9ef73d90a0cf5a692338e9a3f"} Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.684986 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.435976437 podStartE2EDuration="19.684966208s" podCreationTimestamp="2026-01-31 09:42:56 +0000 UTC" firstStartedPulling="2026-01-31 09:43:06.128632236 +0000 UTC m=+1082.100024223" lastFinishedPulling="2026-01-31 09:43:14.377622007 +0000 UTC m=+1090.349013994" observedRunningTime="2026-01-31 09:43:15.667094159 +0000 UTC m=+1091.638486166" watchObservedRunningTime="2026-01-31 09:43:15.684966208 +0000 UTC m=+1091.656358195" Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.687490 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.069654297 podStartE2EDuration="18.687479341s" podCreationTimestamp="2026-01-31 09:42:57 +0000 UTC" firstStartedPulling="2026-01-31 09:43:06.765977752 +0000 UTC m=+1082.737369759" lastFinishedPulling="2026-01-31 09:43:14.383802816 +0000 UTC m=+1090.355194803" observedRunningTime="2026-01-31 09:43:15.682593609 +0000 UTC m=+1091.653985606" watchObservedRunningTime="2026-01-31 09:43:15.687479341 +0000 UTC m=+1091.658871328" Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.715632 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-l45p8" podStartSLOduration=12.912632455 podStartE2EDuration="19.715609778s" podCreationTimestamp="2026-01-31 09:42:56 +0000 UTC" firstStartedPulling="2026-01-31 09:43:04.389705905 +0000 UTC m=+1080.361097892" lastFinishedPulling="2026-01-31 09:43:11.192683228 +0000 UTC m=+1087.164075215" observedRunningTime="2026-01-31 09:43:15.708258754 +0000 UTC m=+1091.679650761" watchObservedRunningTime="2026-01-31 09:43:15.715609778 +0000 UTC m=+1091.687001785" Jan 31 09:43:15 crc kubenswrapper[4992]: I0131 09:43:15.943570 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 31 09:43:16 crc kubenswrapper[4992]: I0131 09:43:16.006407 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 31 09:43:16 crc kubenswrapper[4992]: I0131 09:43:16.644012 4992 generic.go:334] "Generic (PLEG): container finished" podID="71459842-ede4-4fcc-9e61-d884a02b341e" containerID="74f8585d6b0cd23350f635365479817dad93eb5e94252a7527829c62f840f234" exitCode=0 Jan 31 09:43:16 crc kubenswrapper[4992]: I0131 09:43:16.644325 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"71459842-ede4-4fcc-9e61-d884a02b341e","Type":"ContainerDied","Data":"74f8585d6b0cd23350f635365479817dad93eb5e94252a7527829c62f840f234"} Jan 31 09:43:16 crc kubenswrapper[4992]: I0131 09:43:16.647504 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"29ab568d-b0c6-4420-b5fb-d027c4561e2f","Type":"ContainerStarted","Data":"f54e5ee6edbb021aa10b2a0e2e3e7c9514f2cf4bd1699460f18e9bf334d7038d"} Jan 31 09:43:16 crc kubenswrapper[4992]: I0131 09:43:16.647942 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 31 09:43:16 crc kubenswrapper[4992]: I0131 09:43:16.697540 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.849213516 podStartE2EDuration="29.697521579s" podCreationTimestamp="2026-01-31 09:42:47 +0000 UTC" firstStartedPulling="2026-01-31 09:43:04.215939349 +0000 UTC m=+1080.187331336" lastFinishedPulling="2026-01-31 09:43:10.064247402 +0000 UTC m=+1086.035639399" observedRunningTime="2026-01-31 09:43:16.693914155 +0000 UTC m=+1092.665306152" watchObservedRunningTime="2026-01-31 09:43:16.697521579 +0000 UTC m=+1092.668913566" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.272585 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.340684 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.665131 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"71459842-ede4-4fcc-9e61-d884a02b341e","Type":"ContainerStarted","Data":"3dbfb326b3c777dd435c5814aa638f54dcb73f36f0cbf4775d2442945c468308"} Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.665662 4992 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.693699 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.545315252 podStartE2EDuration="29.693682544s" podCreationTimestamp="2026-01-31 09:42:48 +0000 UTC" firstStartedPulling="2026-01-31 09:43:03.906474674 +0000 UTC m=+1079.877866681" lastFinishedPulling="2026-01-31 09:43:11.054841946 +0000 UTC m=+1087.026233973" observedRunningTime="2026-01-31 09:43:17.689491312 +0000 UTC m=+1093.660883389" watchObservedRunningTime="2026-01-31 09:43:17.693682544 +0000 UTC m=+1093.665074531" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.720668 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.727791 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.908063 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5r2p2"] Jan 31 09:43:17 crc kubenswrapper[4992]: E0131 09:43:17.908735 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerName="init" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.908757 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerName="init" Jan 31 09:43:17 crc kubenswrapper[4992]: E0131 09:43:17.908791 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerName="dnsmasq-dns" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.908800 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerName="dnsmasq-dns" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 
09:43:17.908988 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="93164cd8-4706-4c83-b45f-a4a57d931b7d" containerName="dnsmasq-dns" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.909981 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.912983 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.921314 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5r2p2"] Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.987163 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xhfg8"] Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.990567 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:17 crc kubenswrapper[4992]: I0131 09:43:17.995231 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.018998 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xhfg8"] Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050439 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050491 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-config\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050537 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050671 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92074e4-8400-4bd5-9135-4b25d46d4607-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050735 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f92074e4-8400-4bd5-9135-4b25d46d4607-ovs-rundir\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050832 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7tsx\" (UniqueName: \"kubernetes.io/projected/11cac247-5a31-4390-8b2d-bc23ed6e1220-kube-api-access-j7tsx\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050892 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f92074e4-8400-4bd5-9135-4b25d46d4607-ovn-rundir\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.050947 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92074e4-8400-4bd5-9135-4b25d46d4607-config\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.051011 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkr7\" (UniqueName: \"kubernetes.io/projected/f92074e4-8400-4bd5-9135-4b25d46d4607-kube-api-access-4bkr7\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.051077 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92074e4-8400-4bd5-9135-4b25d46d4607-combined-ca-bundle\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.086094 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5r2p2"] Jan 31 09:43:18 crc kubenswrapper[4992]: E0131 09:43:18.086690 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-j7tsx ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" podUID="11cac247-5a31-4390-8b2d-bc23ed6e1220" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.108131 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-gwvqv"] Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.111075 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.112944 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.122500 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gwvqv"] Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.153902 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.153955 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-config\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.153990 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49t8d\" (UniqueName: \"kubernetes.io/projected/71b91424-3380-42f6-a2a3-edcb31b2eee2-kube-api-access-49t8d\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154018 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154056 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154094 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92074e4-8400-4bd5-9135-4b25d46d4607-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154124 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f92074e4-8400-4bd5-9135-4b25d46d4607-ovs-rundir\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154165 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7tsx\" (UniqueName: \"kubernetes.io/projected/11cac247-5a31-4390-8b2d-bc23ed6e1220-kube-api-access-j7tsx\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154210 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f92074e4-8400-4bd5-9135-4b25d46d4607-ovn-rundir\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154239 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92074e4-8400-4bd5-9135-4b25d46d4607-config\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154279 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-dns-svc\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154306 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkr7\" (UniqueName: \"kubernetes.io/projected/f92074e4-8400-4bd5-9135-4b25d46d4607-kube-api-access-4bkr7\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154345 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154373 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92074e4-8400-4bd5-9135-4b25d46d4607-combined-ca-bundle\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.154408 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-config\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.155127 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f92074e4-8400-4bd5-9135-4b25d46d4607-ovn-rundir\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.156115 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f92074e4-8400-4bd5-9135-4b25d46d4607-config\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.157098 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.157706 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/f92074e4-8400-4bd5-9135-4b25d46d4607-ovs-rundir\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.158407 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.158757 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-config\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.161238 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92074e4-8400-4bd5-9135-4b25d46d4607-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.174318 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7tsx\" (UniqueName: \"kubernetes.io/projected/11cac247-5a31-4390-8b2d-bc23ed6e1220-kube-api-access-j7tsx\") pod \"dnsmasq-dns-5bf47b49b7-5r2p2\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.179432 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92074e4-8400-4bd5-9135-4b25d46d4607-combined-ca-bundle\") 
pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.180917 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkr7\" (UniqueName: \"kubernetes.io/projected/f92074e4-8400-4bd5-9135-4b25d46d4607-kube-api-access-4bkr7\") pod \"ovn-controller-metrics-xhfg8\" (UID: \"f92074e4-8400-4bd5-9135-4b25d46d4607\") " pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.216968 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.224266 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.229442 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.229767 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-9dtp6" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.229957 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.230176 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.234446 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255003 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d20275ed-25c4-42ca-9d6e-fc909f6844fb-config\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " 
pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255056 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxxp\" (UniqueName: \"kubernetes.io/projected/d20275ed-25c4-42ca-9d6e-fc909f6844fb-kube-api-access-mdxxp\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255080 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-dns-svc\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255108 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255128 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255156 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d20275ed-25c4-42ca-9d6e-fc909f6844fb-scripts\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255176 
4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-config\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255190 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255231 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255459 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49t8d\" (UniqueName: \"kubernetes.io/projected/71b91424-3380-42f6-a2a3-edcb31b2eee2-kube-api-access-49t8d\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255522 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d20275ed-25c4-42ca-9d6e-fc909f6844fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.255539 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.256325 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-dns-svc\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.256978 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.257844 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-config\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.258300 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.274581 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49t8d\" (UniqueName: \"kubernetes.io/projected/71b91424-3380-42f6-a2a3-edcb31b2eee2-kube-api-access-49t8d\") 
pod \"dnsmasq-dns-8554648995-gwvqv\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.311358 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xhfg8" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356744 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356794 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d20275ed-25c4-42ca-9d6e-fc909f6844fb-scripts\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356818 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356889 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d20275ed-25c4-42ca-9d6e-fc909f6844fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356904 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356926 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d20275ed-25c4-42ca-9d6e-fc909f6844fb-config\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.356958 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxxp\" (UniqueName: \"kubernetes.io/projected/d20275ed-25c4-42ca-9d6e-fc909f6844fb-kube-api-access-mdxxp\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.357844 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d20275ed-25c4-42ca-9d6e-fc909f6844fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.357915 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d20275ed-25c4-42ca-9d6e-fc909f6844fb-scripts\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.358208 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d20275ed-25c4-42ca-9d6e-fc909f6844fb-config\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.362233 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.363186 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.364720 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d20275ed-25c4-42ca-9d6e-fc909f6844fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.374932 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxxp\" (UniqueName: \"kubernetes.io/projected/d20275ed-25c4-42ca-9d6e-fc909f6844fb-kube-api-access-mdxxp\") pod \"ovn-northd-0\" (UID: \"d20275ed-25c4-42ca-9d6e-fc909f6844fb\") " pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.431097 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.546410 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.675271 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.686540 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.742336 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xhfg8"] Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.865853 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-dns-svc\") pod \"11cac247-5a31-4390-8b2d-bc23ed6e1220\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.865939 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-ovsdbserver-nb\") pod \"11cac247-5a31-4390-8b2d-bc23ed6e1220\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.866006 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-config\") pod \"11cac247-5a31-4390-8b2d-bc23ed6e1220\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.866038 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7tsx\" (UniqueName: \"kubernetes.io/projected/11cac247-5a31-4390-8b2d-bc23ed6e1220-kube-api-access-j7tsx\") pod \"11cac247-5a31-4390-8b2d-bc23ed6e1220\" (UID: \"11cac247-5a31-4390-8b2d-bc23ed6e1220\") " Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.866247 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gwvqv"] Jan 31 09:43:18 
crc kubenswrapper[4992]: I0131 09:43:18.867162 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-config" (OuterVolumeSpecName: "config") pod "11cac247-5a31-4390-8b2d-bc23ed6e1220" (UID: "11cac247-5a31-4390-8b2d-bc23ed6e1220"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.867998 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "11cac247-5a31-4390-8b2d-bc23ed6e1220" (UID: "11cac247-5a31-4390-8b2d-bc23ed6e1220"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.868193 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "11cac247-5a31-4390-8b2d-bc23ed6e1220" (UID: "11cac247-5a31-4390-8b2d-bc23ed6e1220"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.871028 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cac247-5a31-4390-8b2d-bc23ed6e1220-kube-api-access-j7tsx" (OuterVolumeSpecName: "kube-api-access-j7tsx") pod "11cac247-5a31-4390-8b2d-bc23ed6e1220" (UID: "11cac247-5a31-4390-8b2d-bc23ed6e1220"). InnerVolumeSpecName "kube-api-access-j7tsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.969164 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7tsx\" (UniqueName: \"kubernetes.io/projected/11cac247-5a31-4390-8b2d-bc23ed6e1220-kube-api-access-j7tsx\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.969502 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.969544 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.969558 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cac247-5a31-4390-8b2d-bc23ed6e1220-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:18 crc kubenswrapper[4992]: I0131 09:43:18.993746 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 31 09:43:18 crc kubenswrapper[4992]: W0131 09:43:18.999703 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd20275ed_25c4_42ca_9d6e_fc909f6844fb.slice/crio-a443909237f999ba659c58d0cf8f05c8530abb455e0906da304d5631c494d10a WatchSource:0}: Error finding container a443909237f999ba659c58d0cf8f05c8530abb455e0906da304d5631c494d10a: Status 404 returned error can't find the container with id a443909237f999ba659c58d0cf8f05c8530abb455e0906da304d5631c494d10a Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.038797 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 31 09:43:19 crc 
kubenswrapper[4992]: I0131 09:43:19.038879 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.684511 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d20275ed-25c4-42ca-9d6e-fc909f6844fb","Type":"ContainerStarted","Data":"a443909237f999ba659c58d0cf8f05c8530abb455e0906da304d5631c494d10a"} Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.685985 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xhfg8" event={"ID":"f92074e4-8400-4bd5-9135-4b25d46d4607","Type":"ContainerStarted","Data":"8b4a670574dab8274cac2171f19b79d68a68c32023dbd6ee12a9b588f5706756"} Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.686011 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xhfg8" event={"ID":"f92074e4-8400-4bd5-9135-4b25d46d4607","Type":"ContainerStarted","Data":"978ca4626223ac3b0bcdccf4677530d3743805110acc980236909417658cc6aa"} Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.687720 4992 generic.go:334] "Generic (PLEG): container finished" podID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerID="0eaa5f1b2fc1e5e9a5dd192d57deec6fa5a68e863110ed44bbf435c50ddaf5d0" exitCode=0 Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.687745 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gwvqv" event={"ID":"71b91424-3380-42f6-a2a3-edcb31b2eee2","Type":"ContainerDied","Data":"0eaa5f1b2fc1e5e9a5dd192d57deec6fa5a68e863110ed44bbf435c50ddaf5d0"} Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.687769 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gwvqv" event={"ID":"71b91424-3380-42f6-a2a3-edcb31b2eee2","Type":"ContainerStarted","Data":"f8b849c3e0338d516f6b5d2adae42adb5435a516649c19236f07d7bfc0da7c16"} Jan 31 09:43:19 crc kubenswrapper[4992]: 
I0131 09:43:19.687807 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-5r2p2" Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.717313 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xhfg8" podStartSLOduration=2.717292333 podStartE2EDuration="2.717292333s" podCreationTimestamp="2026-01-31 09:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:19.713344308 +0000 UTC m=+1095.684736305" watchObservedRunningTime="2026-01-31 09:43:19.717292333 +0000 UTC m=+1095.688684340" Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.798526 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5r2p2"] Jan 31 09:43:19 crc kubenswrapper[4992]: I0131 09:43:19.819887 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-5r2p2"] Jan 31 09:43:20 crc kubenswrapper[4992]: I0131 09:43:20.343090 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 31 09:43:20 crc kubenswrapper[4992]: I0131 09:43:20.343175 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 31 09:43:20 crc kubenswrapper[4992]: I0131 09:43:20.490067 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 31 09:43:20 crc kubenswrapper[4992]: I0131 09:43:20.696062 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d20275ed-25c4-42ca-9d6e-fc909f6844fb","Type":"ContainerStarted","Data":"e69b42bb24a668981154be57377c59e372d0f1fd3292bec6020178dd92f5c76b"} Jan 31 09:43:20 crc kubenswrapper[4992]: I0131 09:43:20.698086 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-gwvqv" event={"ID":"71b91424-3380-42f6-a2a3-edcb31b2eee2","Type":"ContainerStarted","Data":"4e0af4a41c8debfa41dd5f7c5e3f6445b561fd01ecb9c6be9daa74e4832aeb73"} Jan 31 09:43:21 crc kubenswrapper[4992]: I0131 09:43:21.193087 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cac247-5a31-4390-8b2d-bc23ed6e1220" path="/var/lib/kubelet/pods/11cac247-5a31-4390-8b2d-bc23ed6e1220/volumes" Jan 31 09:43:21 crc kubenswrapper[4992]: I0131 09:43:21.713839 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d20275ed-25c4-42ca-9d6e-fc909f6844fb","Type":"ContainerStarted","Data":"19bc79d1f80477a326fb30639c938735252bf10198c84b12b1bb675fd83446cf"} Jan 31 09:43:21 crc kubenswrapper[4992]: I0131 09:43:21.714354 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:21 crc kubenswrapper[4992]: I0131 09:43:21.763172 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-gwvqv" podStartSLOduration=3.763152547 podStartE2EDuration="3.763152547s" podCreationTimestamp="2026-01-31 09:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:20.723925111 +0000 UTC m=+1096.695317108" watchObservedRunningTime="2026-01-31 09:43:21.763152547 +0000 UTC m=+1097.734544534" Jan 31 09:43:21 crc kubenswrapper[4992]: I0131 09:43:21.770492 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.411793018 podStartE2EDuration="3.770468029s" podCreationTimestamp="2026-01-31 09:43:18 +0000 UTC" firstStartedPulling="2026-01-31 09:43:19.001834408 +0000 UTC m=+1094.973226395" lastFinishedPulling="2026-01-31 09:43:20.360509419 +0000 UTC m=+1096.331901406" observedRunningTime="2026-01-31 09:43:21.759385127 +0000 UTC 
m=+1097.730777124" watchObservedRunningTime="2026-01-31 09:43:21.770468029 +0000 UTC m=+1097.741860016" Jan 31 09:43:22 crc kubenswrapper[4992]: I0131 09:43:22.016338 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 31 09:43:22 crc kubenswrapper[4992]: I0131 09:43:22.111445 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 31 09:43:22 crc kubenswrapper[4992]: I0131 09:43:22.607908 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 31 09:43:22 crc kubenswrapper[4992]: I0131 09:43:22.721726 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:27.813486 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5wnkp"] Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:27.816870 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:27.819758 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:27.822332 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5wnkp"] Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:27.911084 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wrl\" (UniqueName: \"kubernetes.io/projected/b8373451-6f50-4db4-9034-e846de3b5d79-kube-api-access-j6wrl\") pod \"root-account-create-update-5wnkp\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:27.911159 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8373451-6f50-4db4-9034-e846de3b5d79-operator-scripts\") pod \"root-account-create-update-5wnkp\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.013020 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8373451-6f50-4db4-9034-e846de3b5d79-operator-scripts\") pod \"root-account-create-update-5wnkp\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.013269 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wrl\" (UniqueName: \"kubernetes.io/projected/b8373451-6f50-4db4-9034-e846de3b5d79-kube-api-access-j6wrl\") pod \"root-account-create-update-5wnkp\" (UID: 
\"b8373451-6f50-4db4-9034-e846de3b5d79\") " pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.014345 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8373451-6f50-4db4-9034-e846de3b5d79-operator-scripts\") pod \"root-account-create-update-5wnkp\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.034078 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wrl\" (UniqueName: \"kubernetes.io/projected/b8373451-6f50-4db4-9034-e846de3b5d79-kube-api-access-j6wrl\") pod \"root-account-create-update-5wnkp\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.139024 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.432621 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.477298 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-c966m"] Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.477544 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerName="dnsmasq-dns" containerID="cri-o://5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7" gracePeriod=10 Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.635594 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5wnkp"] Jan 31 09:43:28 crc kubenswrapper[4992]: W0131 09:43:28.641702 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8373451_6f50_4db4_9034_e846de3b5d79.slice/crio-3d46dc60be1762dc79a41b8e5fd90b9ce9b8a348fabc398b5f16a9a268caf2f4 WatchSource:0}: Error finding container 3d46dc60be1762dc79a41b8e5fd90b9ce9b8a348fabc398b5f16a9a268caf2f4: Status 404 returned error can't find the container with id 3d46dc60be1762dc79a41b8e5fd90b9ce9b8a348fabc398b5f16a9a268caf2f4 Jan 31 09:43:28 crc kubenswrapper[4992]: I0131 09:43:28.762722 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5wnkp" event={"ID":"b8373451-6f50-4db4-9034-e846de3b5d79","Type":"ContainerStarted","Data":"3d46dc60be1762dc79a41b8e5fd90b9ce9b8a348fabc398b5f16a9a268caf2f4"} Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.495645 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.554997 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-config\") pod \"c7e2707b-6952-4179-8bc8-5d02c57126af\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.555077 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdq4\" (UniqueName: \"kubernetes.io/projected/c7e2707b-6952-4179-8bc8-5d02c57126af-kube-api-access-tzdq4\") pod \"c7e2707b-6952-4179-8bc8-5d02c57126af\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.555149 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-dns-svc\") pod \"c7e2707b-6952-4179-8bc8-5d02c57126af\" (UID: \"c7e2707b-6952-4179-8bc8-5d02c57126af\") " Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.563607 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e2707b-6952-4179-8bc8-5d02c57126af-kube-api-access-tzdq4" (OuterVolumeSpecName: "kube-api-access-tzdq4") pod "c7e2707b-6952-4179-8bc8-5d02c57126af" (UID: "c7e2707b-6952-4179-8bc8-5d02c57126af"). InnerVolumeSpecName "kube-api-access-tzdq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.602138 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7e2707b-6952-4179-8bc8-5d02c57126af" (UID: "c7e2707b-6952-4179-8bc8-5d02c57126af"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.603913 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-config" (OuterVolumeSpecName: "config") pod "c7e2707b-6952-4179-8bc8-5d02c57126af" (UID: "c7e2707b-6952-4179-8bc8-5d02c57126af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.657183 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.657215 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7e2707b-6952-4179-8bc8-5d02c57126af-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.657224 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzdq4\" (UniqueName: \"kubernetes.io/projected/c7e2707b-6952-4179-8bc8-5d02c57126af-kube-api-access-tzdq4\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.770665 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5wnkp" event={"ID":"b8373451-6f50-4db4-9034-e846de3b5d79","Type":"ContainerStarted","Data":"513b36f53645297f742259b45d5286c8090d81f9e11fe4bfb5dfb00eadf6eec0"} Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.772551 4992 generic.go:334] "Generic (PLEG): container finished" podID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerID="5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7" exitCode=0 Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.772585 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" 
event={"ID":"c7e2707b-6952-4179-8bc8-5d02c57126af","Type":"ContainerDied","Data":"5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7"} Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.772612 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.772627 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-c966m" event={"ID":"c7e2707b-6952-4179-8bc8-5d02c57126af","Type":"ContainerDied","Data":"df6572e2dcb41489a3e45b061ce21c4a6b2b656d4320ffcd65ac71cac20286af"} Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.772649 4992 scope.go:117] "RemoveContainer" containerID="5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.796461 4992 scope.go:117] "RemoveContainer" containerID="ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.797729 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-5wnkp" podStartSLOduration=2.797711714 podStartE2EDuration="2.797711714s" podCreationTimestamp="2026-01-31 09:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:29.789463116 +0000 UTC m=+1105.760855143" watchObservedRunningTime="2026-01-31 09:43:29.797711714 +0000 UTC m=+1105.769103701" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.810454 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-c966m"] Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.812374 4992 scope.go:117] "RemoveContainer" containerID="5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7" Jan 31 09:43:29 crc kubenswrapper[4992]: E0131 09:43:29.812915 4992 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7\": container with ID starting with 5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7 not found: ID does not exist" containerID="5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.812951 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7"} err="failed to get container status \"5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7\": rpc error: code = NotFound desc = could not find container \"5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7\": container with ID starting with 5313c0916eeaff1fc9565511932f5dc162439c7834b719206666368dbb8ab9f7 not found: ID does not exist" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.812974 4992 scope.go:117] "RemoveContainer" containerID="ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24" Jan 31 09:43:29 crc kubenswrapper[4992]: E0131 09:43:29.813312 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24\": container with ID starting with ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24 not found: ID does not exist" containerID="ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.813357 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24"} err="failed to get container status \"ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24\": rpc error: code = NotFound 
desc = could not find container \"ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24\": container with ID starting with ac42dbe37165c33365d2c1fb166f3b986626db8f125c721e733fb9e3bfa49a24 not found: ID does not exist" Jan 31 09:43:29 crc kubenswrapper[4992]: I0131 09:43:29.818725 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-c966m"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.130007 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mq26l"] Jan 31 09:43:30 crc kubenswrapper[4992]: E0131 09:43:30.130325 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerName="init" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.130341 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerName="init" Jan 31 09:43:30 crc kubenswrapper[4992]: E0131 09:43:30.130371 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerName="dnsmasq-dns" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.130378 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerName="dnsmasq-dns" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.130526 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" containerName="dnsmasq-dns" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.131007 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.137038 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mq26l"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.272199 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d73-account-create-update-tl9tj"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.273284 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgqk\" (UniqueName: \"kubernetes.io/projected/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-kube-api-access-mdgqk\") pod \"keystone-db-create-mq26l\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.273773 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-operator-scripts\") pod \"keystone-db-create-mq26l\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.276538 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.279939 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.284436 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d73-account-create-update-tl9tj"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.375905 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgqk\" (UniqueName: \"kubernetes.io/projected/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-kube-api-access-mdgqk\") pod \"keystone-db-create-mq26l\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.376411 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwgjg\" (UniqueName: \"kubernetes.io/projected/579e533f-ff8a-477d-b7ca-99835fec403c-kube-api-access-nwgjg\") pod \"keystone-7d73-account-create-update-tl9tj\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.376619 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-operator-scripts\") pod \"keystone-db-create-mq26l\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.376741 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579e533f-ff8a-477d-b7ca-99835fec403c-operator-scripts\") pod \"keystone-7d73-account-create-update-tl9tj\" (UID: 
\"579e533f-ff8a-477d-b7ca-99835fec403c\") " pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.377694 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-operator-scripts\") pod \"keystone-db-create-mq26l\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.395445 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdgqk\" (UniqueName: \"kubernetes.io/projected/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-kube-api-access-mdgqk\") pod \"keystone-db-create-mq26l\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.446875 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.472389 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-n2gjj"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.473850 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.478500 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c240-account-create-update-jfkt9"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.479547 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579e533f-ff8a-477d-b7ca-99835fec403c-operator-scripts\") pod \"keystone-7d73-account-create-update-tl9tj\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.478556 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579e533f-ff8a-477d-b7ca-99835fec403c-operator-scripts\") pod \"keystone-7d73-account-create-update-tl9tj\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.479861 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwgjg\" (UniqueName: \"kubernetes.io/projected/579e533f-ff8a-477d-b7ca-99835fec403c-kube-api-access-nwgjg\") pod \"keystone-7d73-account-create-update-tl9tj\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.482382 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.484722 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.485216 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n2gjj"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.496731 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwgjg\" (UniqueName: \"kubernetes.io/projected/579e533f-ff8a-477d-b7ca-99835fec403c-kube-api-access-nwgjg\") pod \"keystone-7d73-account-create-update-tl9tj\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.555755 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c240-account-create-update-jfkt9"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.581488 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f670d9f-8208-4b3d-b7f8-902b28d63375-operator-scripts\") pod \"placement-db-create-n2gjj\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.581538 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8x7p\" (UniqueName: \"kubernetes.io/projected/6f670d9f-8208-4b3d-b7f8-902b28d63375-kube-api-access-v8x7p\") pod \"placement-db-create-n2gjj\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.581640 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-npmpb\" (UniqueName: \"kubernetes.io/projected/5afd2ddf-cdf8-4a71-862f-b22cceae2852-kube-api-access-npmpb\") pod \"placement-c240-account-create-update-jfkt9\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.581752 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5afd2ddf-cdf8-4a71-862f-b22cceae2852-operator-scripts\") pod \"placement-c240-account-create-update-jfkt9\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.594460 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.663405 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-b42h5"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.664475 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.672284 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9b8a-account-create-update-vk6sb"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.673485 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.675188 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.686234 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npmpb\" (UniqueName: \"kubernetes.io/projected/5afd2ddf-cdf8-4a71-862f-b22cceae2852-kube-api-access-npmpb\") pod \"placement-c240-account-create-update-jfkt9\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.686279 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5afd2ddf-cdf8-4a71-862f-b22cceae2852-operator-scripts\") pod \"placement-c240-account-create-update-jfkt9\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.686344 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f670d9f-8208-4b3d-b7f8-902b28d63375-operator-scripts\") pod \"placement-db-create-n2gjj\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.686376 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8x7p\" (UniqueName: \"kubernetes.io/projected/6f670d9f-8208-4b3d-b7f8-902b28d63375-kube-api-access-v8x7p\") pod \"placement-db-create-n2gjj\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.687254 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5afd2ddf-cdf8-4a71-862f-b22cceae2852-operator-scripts\") pod \"placement-c240-account-create-update-jfkt9\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.687331 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f670d9f-8208-4b3d-b7f8-902b28d63375-operator-scripts\") pod \"placement-db-create-n2gjj\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.693666 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-b42h5"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.704493 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9b8a-account-create-update-vk6sb"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.706073 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8x7p\" (UniqueName: \"kubernetes.io/projected/6f670d9f-8208-4b3d-b7f8-902b28d63375-kube-api-access-v8x7p\") pod \"placement-db-create-n2gjj\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.711631 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmpb\" (UniqueName: \"kubernetes.io/projected/5afd2ddf-cdf8-4a71-862f-b22cceae2852-kube-api-access-npmpb\") pod \"placement-c240-account-create-update-jfkt9\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.790748 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f89a5da-e099-4e2f-a95b-fdc648424a96-operator-scripts\") pod \"glance-9b8a-account-create-update-vk6sb\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.790964 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpzq\" (UniqueName: \"kubernetes.io/projected/09c89abf-868c-4405-8b61-c714b4f0a2fc-kube-api-access-mbpzq\") pod \"glance-db-create-b42h5\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.791134 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c89abf-868c-4405-8b61-c714b4f0a2fc-operator-scripts\") pod \"glance-db-create-b42h5\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.791594 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/7f89a5da-e099-4e2f-a95b-fdc648424a96-kube-api-access-7jpct\") pod \"glance-9b8a-account-create-update-vk6sb\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.872232 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.893344 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpzq\" (UniqueName: \"kubernetes.io/projected/09c89abf-868c-4405-8b61-c714b4f0a2fc-kube-api-access-mbpzq\") pod \"glance-db-create-b42h5\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.893450 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c89abf-868c-4405-8b61-c714b4f0a2fc-operator-scripts\") pod \"glance-db-create-b42h5\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.893500 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/7f89a5da-e099-4e2f-a95b-fdc648424a96-kube-api-access-7jpct\") pod \"glance-9b8a-account-create-update-vk6sb\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.893621 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f89a5da-e099-4e2f-a95b-fdc648424a96-operator-scripts\") pod \"glance-9b8a-account-create-update-vk6sb\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.894381 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f89a5da-e099-4e2f-a95b-fdc648424a96-operator-scripts\") pod \"glance-9b8a-account-create-update-vk6sb\" (UID: 
\"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.895201 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c89abf-868c-4405-8b61-c714b4f0a2fc-operator-scripts\") pod \"glance-db-create-b42h5\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.896402 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.966006 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/7f89a5da-e099-4e2f-a95b-fdc648424a96-kube-api-access-7jpct\") pod \"glance-9b8a-account-create-update-vk6sb\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.968151 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpzq\" (UniqueName: \"kubernetes.io/projected/09c89abf-868c-4405-8b61-c714b4f0a2fc-kube-api-access-mbpzq\") pod \"glance-db-create-b42h5\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " pod="openstack/glance-db-create-b42h5" Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.979894 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mq26l"] Jan 31 09:43:30 crc kubenswrapper[4992]: I0131 09:43:30.994120 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b42h5" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.006377 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.124245 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d73-account-create-update-tl9tj"] Jan 31 09:43:31 crc kubenswrapper[4992]: W0131 09:43:31.143089 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579e533f_ff8a_477d_b7ca_99835fec403c.slice/crio-94f3ffa232798775650fea97b3d5521397697ad485f63c06d81cda001a1bf62a WatchSource:0}: Error finding container 94f3ffa232798775650fea97b3d5521397697ad485f63c06d81cda001a1bf62a: Status 404 returned error can't find the container with id 94f3ffa232798775650fea97b3d5521397697ad485f63c06d81cda001a1bf62a Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.194911 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e2707b-6952-4179-8bc8-5d02c57126af" path="/var/lib/kubelet/pods/c7e2707b-6952-4179-8bc8-5d02c57126af/volumes" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.438478 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c240-account-create-update-jfkt9"] Jan 31 09:43:31 crc kubenswrapper[4992]: W0131 09:43:31.446963 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5afd2ddf_cdf8_4a71_862f_b22cceae2852.slice/crio-aa9e82580550a87746eba5f5dc771339b33888bf86805f32eb88b098713b69d0 WatchSource:0}: Error finding container aa9e82580550a87746eba5f5dc771339b33888bf86805f32eb88b098713b69d0: Status 404 returned error can't find the container with id aa9e82580550a87746eba5f5dc771339b33888bf86805f32eb88b098713b69d0 Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.535528 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n2gjj"] Jan 31 09:43:31 crc kubenswrapper[4992]: W0131 09:43:31.544159 4992 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f670d9f_8208_4b3d_b7f8_902b28d63375.slice/crio-1647753b2ef0bc70a3bc11bc8b1de6389cf73bb9ae0dad6143a140d121b4fd93 WatchSource:0}: Error finding container 1647753b2ef0bc70a3bc11bc8b1de6389cf73bb9ae0dad6143a140d121b4fd93: Status 404 returned error can't find the container with id 1647753b2ef0bc70a3bc11bc8b1de6389cf73bb9ae0dad6143a140d121b4fd93 Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.621093 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9b8a-account-create-update-vk6sb"] Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.628895 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-b42h5"] Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.796385 4992 generic.go:334] "Generic (PLEG): container finished" podID="b8373451-6f50-4db4-9034-e846de3b5d79" containerID="513b36f53645297f742259b45d5286c8090d81f9e11fe4bfb5dfb00eadf6eec0" exitCode=0 Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.796453 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5wnkp" event={"ID":"b8373451-6f50-4db4-9034-e846de3b5d79","Type":"ContainerDied","Data":"513b36f53645297f742259b45d5286c8090d81f9e11fe4bfb5dfb00eadf6eec0"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.797866 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mq26l" event={"ID":"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b","Type":"ContainerStarted","Data":"b915d530c1d6e23ea3b77b88f1cb76e04cb9a79cb0b988cb1a621ddb4e8d6304"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.797897 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mq26l" event={"ID":"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b","Type":"ContainerStarted","Data":"191695a819ecbb1109266b56e8b89c56dd409a5747b1814d3b14aebfb500b772"} 
Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.799560 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8a-account-create-update-vk6sb" event={"ID":"7f89a5da-e099-4e2f-a95b-fdc648424a96","Type":"ContainerStarted","Data":"a125c6652d5a45fddfe04455bb030d8b1b580ef1b6ddf6353284f470e3f68563"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.799606 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8a-account-create-update-vk6sb" event={"ID":"7f89a5da-e099-4e2f-a95b-fdc648424a96","Type":"ContainerStarted","Data":"0e9737c33eed3356b04c43a7fa0e55cafa468354e059a37465c8fe0fde4d91de"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.801457 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b42h5" event={"ID":"09c89abf-868c-4405-8b61-c714b4f0a2fc","Type":"ContainerStarted","Data":"4e49450a3e88467ef58c8a1ade1b4326bab7416669730f3e9ec9b4c1db4447e9"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.801484 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b42h5" event={"ID":"09c89abf-868c-4405-8b61-c714b4f0a2fc","Type":"ContainerStarted","Data":"92f7fe0d2ebd7306d3a7eb7f121ec42cb69252a7120dc411458e6472f84a8f48"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.806274 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c240-account-create-update-jfkt9" event={"ID":"5afd2ddf-cdf8-4a71-862f-b22cceae2852","Type":"ContainerStarted","Data":"da1bb5046b900cafc0140c7654537e65d84bfe3cae67fa00a8f81a7cf68e79bd"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.806315 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c240-account-create-update-jfkt9" event={"ID":"5afd2ddf-cdf8-4a71-862f-b22cceae2852","Type":"ContainerStarted","Data":"aa9e82580550a87746eba5f5dc771339b33888bf86805f32eb88b098713b69d0"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.808411 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2gjj" event={"ID":"6f670d9f-8208-4b3d-b7f8-902b28d63375","Type":"ContainerStarted","Data":"bec59689753821e3d838bb5be9f9d7584201e77be80bd347e712c396937ba831"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.808450 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2gjj" event={"ID":"6f670d9f-8208-4b3d-b7f8-902b28d63375","Type":"ContainerStarted","Data":"1647753b2ef0bc70a3bc11bc8b1de6389cf73bb9ae0dad6143a140d121b4fd93"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.813974 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d73-account-create-update-tl9tj" event={"ID":"579e533f-ff8a-477d-b7ca-99835fec403c","Type":"ContainerStarted","Data":"6c68440cf9fcfce76bd1d78c1272c4a1440d6b4a940a03cf907014d0d8b28d84"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.814007 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d73-account-create-update-tl9tj" event={"ID":"579e533f-ff8a-477d-b7ca-99835fec403c","Type":"ContainerStarted","Data":"94f3ffa232798775650fea97b3d5521397697ad485f63c06d81cda001a1bf62a"} Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.832696 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-b42h5" podStartSLOduration=1.832677444 podStartE2EDuration="1.832677444s" podCreationTimestamp="2026-01-31 09:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:31.830637536 +0000 UTC m=+1107.802029533" watchObservedRunningTime="2026-01-31 09:43:31.832677444 +0000 UTC m=+1107.804069421" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.852298 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9b8a-account-create-update-vk6sb" podStartSLOduration=1.852274999 
podStartE2EDuration="1.852274999s" podCreationTimestamp="2026-01-31 09:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:31.843729173 +0000 UTC m=+1107.815121170" watchObservedRunningTime="2026-01-31 09:43:31.852274999 +0000 UTC m=+1107.823666986" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.865336 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c240-account-create-update-jfkt9" podStartSLOduration=1.8653161950000001 podStartE2EDuration="1.865316195s" podCreationTimestamp="2026-01-31 09:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:31.858813828 +0000 UTC m=+1107.830205825" watchObservedRunningTime="2026-01-31 09:43:31.865316195 +0000 UTC m=+1107.836708182" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.882807 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-n2gjj" podStartSLOduration=1.882788449 podStartE2EDuration="1.882788449s" podCreationTimestamp="2026-01-31 09:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:31.879141693 +0000 UTC m=+1107.850533700" watchObservedRunningTime="2026-01-31 09:43:31.882788449 +0000 UTC m=+1107.854180436" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.899988 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-mq26l" podStartSLOduration=1.899970374 podStartE2EDuration="1.899970374s" podCreationTimestamp="2026-01-31 09:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:31.899295774 +0000 UTC m=+1107.870687771" 
watchObservedRunningTime="2026-01-31 09:43:31.899970374 +0000 UTC m=+1107.871362361" Jan 31 09:43:31 crc kubenswrapper[4992]: I0131 09:43:31.933877 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d73-account-create-update-tl9tj" podStartSLOduration=1.9338595299999999 podStartE2EDuration="1.93385953s" podCreationTimestamp="2026-01-31 09:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:43:31.928541257 +0000 UTC m=+1107.899933244" watchObservedRunningTime="2026-01-31 09:43:31.93385953 +0000 UTC m=+1107.905251517" Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.824201 4992 generic.go:334] "Generic (PLEG): container finished" podID="5afd2ddf-cdf8-4a71-862f-b22cceae2852" containerID="da1bb5046b900cafc0140c7654537e65d84bfe3cae67fa00a8f81a7cf68e79bd" exitCode=0 Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.824250 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c240-account-create-update-jfkt9" event={"ID":"5afd2ddf-cdf8-4a71-862f-b22cceae2852","Type":"ContainerDied","Data":"da1bb5046b900cafc0140c7654537e65d84bfe3cae67fa00a8f81a7cf68e79bd"} Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.826235 4992 generic.go:334] "Generic (PLEG): container finished" podID="6f670d9f-8208-4b3d-b7f8-902b28d63375" containerID="bec59689753821e3d838bb5be9f9d7584201e77be80bd347e712c396937ba831" exitCode=0 Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.826271 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2gjj" event={"ID":"6f670d9f-8208-4b3d-b7f8-902b28d63375","Type":"ContainerDied","Data":"bec59689753821e3d838bb5be9f9d7584201e77be80bd347e712c396937ba831"} Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.827798 4992 generic.go:334] "Generic (PLEG): container finished" podID="579e533f-ff8a-477d-b7ca-99835fec403c" 
containerID="6c68440cf9fcfce76bd1d78c1272c4a1440d6b4a940a03cf907014d0d8b28d84" exitCode=0 Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.827864 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d73-account-create-update-tl9tj" event={"ID":"579e533f-ff8a-477d-b7ca-99835fec403c","Type":"ContainerDied","Data":"6c68440cf9fcfce76bd1d78c1272c4a1440d6b4a940a03cf907014d0d8b28d84"} Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.829310 4992 generic.go:334] "Generic (PLEG): container finished" podID="2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" containerID="b915d530c1d6e23ea3b77b88f1cb76e04cb9a79cb0b988cb1a621ddb4e8d6304" exitCode=0 Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.829360 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mq26l" event={"ID":"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b","Type":"ContainerDied","Data":"b915d530c1d6e23ea3b77b88f1cb76e04cb9a79cb0b988cb1a621ddb4e8d6304"} Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.831172 4992 generic.go:334] "Generic (PLEG): container finished" podID="7f89a5da-e099-4e2f-a95b-fdc648424a96" containerID="a125c6652d5a45fddfe04455bb030d8b1b580ef1b6ddf6353284f470e3f68563" exitCode=0 Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.831283 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8a-account-create-update-vk6sb" event={"ID":"7f89a5da-e099-4e2f-a95b-fdc648424a96","Type":"ContainerDied","Data":"a125c6652d5a45fddfe04455bb030d8b1b580ef1b6ddf6353284f470e3f68563"} Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.832941 4992 generic.go:334] "Generic (PLEG): container finished" podID="09c89abf-868c-4405-8b61-c714b4f0a2fc" containerID="4e49450a3e88467ef58c8a1ade1b4326bab7416669730f3e9ec9b4c1db4447e9" exitCode=0 Jan 31 09:43:32 crc kubenswrapper[4992]: I0131 09:43:32.833049 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b42h5" 
event={"ID":"09c89abf-868c-4405-8b61-c714b4f0a2fc","Type":"ContainerDied","Data":"4e49450a3e88467ef58c8a1ade1b4326bab7416669730f3e9ec9b4c1db4447e9"} Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.170181 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.236110 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6wrl\" (UniqueName: \"kubernetes.io/projected/b8373451-6f50-4db4-9034-e846de3b5d79-kube-api-access-j6wrl\") pod \"b8373451-6f50-4db4-9034-e846de3b5d79\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.236189 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8373451-6f50-4db4-9034-e846de3b5d79-operator-scripts\") pod \"b8373451-6f50-4db4-9034-e846de3b5d79\" (UID: \"b8373451-6f50-4db4-9034-e846de3b5d79\") " Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.236836 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8373451-6f50-4db4-9034-e846de3b5d79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8373451-6f50-4db4-9034-e846de3b5d79" (UID: "b8373451-6f50-4db4-9034-e846de3b5d79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.241307 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8373451-6f50-4db4-9034-e846de3b5d79-kube-api-access-j6wrl" (OuterVolumeSpecName: "kube-api-access-j6wrl") pod "b8373451-6f50-4db4-9034-e846de3b5d79" (UID: "b8373451-6f50-4db4-9034-e846de3b5d79"). InnerVolumeSpecName "kube-api-access-j6wrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.337936 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6wrl\" (UniqueName: \"kubernetes.io/projected/b8373451-6f50-4db4-9034-e846de3b5d79-kube-api-access-j6wrl\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.337978 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8373451-6f50-4db4-9034-e846de3b5d79-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.621903 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.702181 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.844148 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5wnkp" event={"ID":"b8373451-6f50-4db4-9034-e846de3b5d79","Type":"ContainerDied","Data":"3d46dc60be1762dc79a41b8e5fd90b9ce9b8a348fabc398b5f16a9a268caf2f4"} Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.844481 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d46dc60be1762dc79a41b8e5fd90b9ce9b8a348fabc398b5f16a9a268caf2f4" Jan 31 09:43:33 crc kubenswrapper[4992]: I0131 09:43:33.844276 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5wnkp" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.153042 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.257200 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f670d9f-8208-4b3d-b7f8-902b28d63375-operator-scripts\") pod \"6f670d9f-8208-4b3d-b7f8-902b28d63375\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.257387 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8x7p\" (UniqueName: \"kubernetes.io/projected/6f670d9f-8208-4b3d-b7f8-902b28d63375-kube-api-access-v8x7p\") pod \"6f670d9f-8208-4b3d-b7f8-902b28d63375\" (UID: \"6f670d9f-8208-4b3d-b7f8-902b28d63375\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.260197 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f670d9f-8208-4b3d-b7f8-902b28d63375-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f670d9f-8208-4b3d-b7f8-902b28d63375" (UID: "6f670d9f-8208-4b3d-b7f8-902b28d63375"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.267796 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f670d9f-8208-4b3d-b7f8-902b28d63375-kube-api-access-v8x7p" (OuterVolumeSpecName: "kube-api-access-v8x7p") pod "6f670d9f-8208-4b3d-b7f8-902b28d63375" (UID: "6f670d9f-8208-4b3d-b7f8-902b28d63375"). InnerVolumeSpecName "kube-api-access-v8x7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.359608 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8x7p\" (UniqueName: \"kubernetes.io/projected/6f670d9f-8208-4b3d-b7f8-902b28d63375-kube-api-access-v8x7p\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.359633 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f670d9f-8208-4b3d-b7f8-902b28d63375-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.431897 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b42h5" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.441247 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.455216 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.462119 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.464023 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpzq\" (UniqueName: \"kubernetes.io/projected/09c89abf-868c-4405-8b61-c714b4f0a2fc-kube-api-access-mbpzq\") pod \"09c89abf-868c-4405-8b61-c714b4f0a2fc\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.464175 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c89abf-868c-4405-8b61-c714b4f0a2fc-operator-scripts\") pod \"09c89abf-868c-4405-8b61-c714b4f0a2fc\" (UID: \"09c89abf-868c-4405-8b61-c714b4f0a2fc\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.464619 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c89abf-868c-4405-8b61-c714b4f0a2fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09c89abf-868c-4405-8b61-c714b4f0a2fc" (UID: "09c89abf-868c-4405-8b61-c714b4f0a2fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.473771 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.478645 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c89abf-868c-4405-8b61-c714b4f0a2fc-kube-api-access-mbpzq" (OuterVolumeSpecName: "kube-api-access-mbpzq") pod "09c89abf-868c-4405-8b61-c714b4f0a2fc" (UID: "09c89abf-868c-4405-8b61-c714b4f0a2fc"). InnerVolumeSpecName "kube-api-access-mbpzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567156 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwgjg\" (UniqueName: \"kubernetes.io/projected/579e533f-ff8a-477d-b7ca-99835fec403c-kube-api-access-nwgjg\") pod \"579e533f-ff8a-477d-b7ca-99835fec403c\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567272 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdgqk\" (UniqueName: \"kubernetes.io/projected/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-kube-api-access-mdgqk\") pod \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567307 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5afd2ddf-cdf8-4a71-862f-b22cceae2852-operator-scripts\") pod \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567337 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-operator-scripts\") pod \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\" (UID: \"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567392 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/7f89a5da-e099-4e2f-a95b-fdc648424a96-kube-api-access-7jpct\") pod \"7f89a5da-e099-4e2f-a95b-fdc648424a96\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567456 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579e533f-ff8a-477d-b7ca-99835fec403c-operator-scripts\") pod \"579e533f-ff8a-477d-b7ca-99835fec403c\" (UID: \"579e533f-ff8a-477d-b7ca-99835fec403c\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.567511 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npmpb\" (UniqueName: \"kubernetes.io/projected/5afd2ddf-cdf8-4a71-862f-b22cceae2852-kube-api-access-npmpb\") pod \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\" (UID: \"5afd2ddf-cdf8-4a71-862f-b22cceae2852\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.571859 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" (UID: "2ceda8d2-7468-42f8-beb8-bf2ac95bea0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.571920 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579e533f-ff8a-477d-b7ca-99835fec403c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "579e533f-ff8a-477d-b7ca-99835fec403c" (UID: "579e533f-ff8a-477d-b7ca-99835fec403c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.572278 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5afd2ddf-cdf8-4a71-862f-b22cceae2852-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5afd2ddf-cdf8-4a71-862f-b22cceae2852" (UID: "5afd2ddf-cdf8-4a71-862f-b22cceae2852"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.573600 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f89a5da-e099-4e2f-a95b-fdc648424a96-operator-scripts\") pod \"7f89a5da-e099-4e2f-a95b-fdc648424a96\" (UID: \"7f89a5da-e099-4e2f-a95b-fdc648424a96\") " Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.574558 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f89a5da-e099-4e2f-a95b-fdc648424a96-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f89a5da-e099-4e2f-a95b-fdc648424a96" (UID: "7f89a5da-e099-4e2f-a95b-fdc648424a96"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.574591 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579e533f-ff8a-477d-b7ca-99835fec403c-kube-api-access-nwgjg" (OuterVolumeSpecName: "kube-api-access-nwgjg") pod "579e533f-ff8a-477d-b7ca-99835fec403c" (UID: "579e533f-ff8a-477d-b7ca-99835fec403c"). InnerVolumeSpecName "kube-api-access-nwgjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575243 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5afd2ddf-cdf8-4a71-862f-b22cceae2852-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575265 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575530 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbpzq\" (UniqueName: \"kubernetes.io/projected/09c89abf-868c-4405-8b61-c714b4f0a2fc-kube-api-access-mbpzq\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575541 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579e533f-ff8a-477d-b7ca-99835fec403c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575550 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f89a5da-e099-4e2f-a95b-fdc648424a96-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575559 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwgjg\" (UniqueName: \"kubernetes.io/projected/579e533f-ff8a-477d-b7ca-99835fec403c-kube-api-access-nwgjg\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.575568 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c89abf-868c-4405-8b61-c714b4f0a2fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc 
kubenswrapper[4992]: I0131 09:43:34.575683 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-kube-api-access-mdgqk" (OuterVolumeSpecName: "kube-api-access-mdgqk") pod "2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" (UID: "2ceda8d2-7468-42f8-beb8-bf2ac95bea0b"). InnerVolumeSpecName "kube-api-access-mdgqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.579121 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f89a5da-e099-4e2f-a95b-fdc648424a96-kube-api-access-7jpct" (OuterVolumeSpecName: "kube-api-access-7jpct") pod "7f89a5da-e099-4e2f-a95b-fdc648424a96" (UID: "7f89a5da-e099-4e2f-a95b-fdc648424a96"). InnerVolumeSpecName "kube-api-access-7jpct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.580401 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5afd2ddf-cdf8-4a71-862f-b22cceae2852-kube-api-access-npmpb" (OuterVolumeSpecName: "kube-api-access-npmpb") pod "5afd2ddf-cdf8-4a71-862f-b22cceae2852" (UID: "5afd2ddf-cdf8-4a71-862f-b22cceae2852"). InnerVolumeSpecName "kube-api-access-npmpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.677356 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npmpb\" (UniqueName: \"kubernetes.io/projected/5afd2ddf-cdf8-4a71-862f-b22cceae2852-kube-api-access-npmpb\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.677699 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdgqk\" (UniqueName: \"kubernetes.io/projected/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b-kube-api-access-mdgqk\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.677718 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jpct\" (UniqueName: \"kubernetes.io/projected/7f89a5da-e099-4e2f-a95b-fdc648424a96-kube-api-access-7jpct\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.852818 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-b42h5" event={"ID":"09c89abf-868c-4405-8b61-c714b4f0a2fc","Type":"ContainerDied","Data":"92f7fe0d2ebd7306d3a7eb7f121ec42cb69252a7120dc411458e6472f84a8f48"} Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.852867 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92f7fe0d2ebd7306d3a7eb7f121ec42cb69252a7120dc411458e6472f84a8f48" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.852953 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-b42h5" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.860259 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c240-account-create-update-jfkt9" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.860275 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c240-account-create-update-jfkt9" event={"ID":"5afd2ddf-cdf8-4a71-862f-b22cceae2852","Type":"ContainerDied","Data":"aa9e82580550a87746eba5f5dc771339b33888bf86805f32eb88b098713b69d0"} Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.860722 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9e82580550a87746eba5f5dc771339b33888bf86805f32eb88b098713b69d0" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.862269 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n2gjj" event={"ID":"6f670d9f-8208-4b3d-b7f8-902b28d63375","Type":"ContainerDied","Data":"1647753b2ef0bc70a3bc11bc8b1de6389cf73bb9ae0dad6143a140d121b4fd93"} Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.862322 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1647753b2ef0bc70a3bc11bc8b1de6389cf73bb9ae0dad6143a140d121b4fd93" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.862301 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n2gjj" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.863985 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d73-account-create-update-tl9tj" event={"ID":"579e533f-ff8a-477d-b7ca-99835fec403c","Type":"ContainerDied","Data":"94f3ffa232798775650fea97b3d5521397697ad485f63c06d81cda001a1bf62a"} Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.864009 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94f3ffa232798775650fea97b3d5521397697ad485f63c06d81cda001a1bf62a" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.864069 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d73-account-create-update-tl9tj" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.871102 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mq26l" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.871005 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mq26l" event={"ID":"2ceda8d2-7468-42f8-beb8-bf2ac95bea0b","Type":"ContainerDied","Data":"191695a819ecbb1109266b56e8b89c56dd409a5747b1814d3b14aebfb500b772"} Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.871391 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191695a819ecbb1109266b56e8b89c56dd409a5747b1814d3b14aebfb500b772" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.872295 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9b8a-account-create-update-vk6sb" event={"ID":"7f89a5da-e099-4e2f-a95b-fdc648424a96","Type":"ContainerDied","Data":"0e9737c33eed3356b04c43a7fa0e55cafa468354e059a37465c8fe0fde4d91de"} Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.872332 4992 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0e9737c33eed3356b04c43a7fa0e55cafa468354e059a37465c8fe0fde4d91de" Jan 31 09:43:34 crc kubenswrapper[4992]: I0131 09:43:34.872373 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9b8a-account-create-update-vk6sb" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822505 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-96xss"] Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822853 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8373451-6f50-4db4-9034-e846de3b5d79" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822875 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8373451-6f50-4db4-9034-e846de3b5d79" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822891 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f670d9f-8208-4b3d-b7f8-902b28d63375" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822902 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f670d9f-8208-4b3d-b7f8-902b28d63375" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822916 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afd2ddf-cdf8-4a71-862f-b22cceae2852" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822924 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afd2ddf-cdf8-4a71-862f-b22cceae2852" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822939 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822945 4992 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822959 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c89abf-868c-4405-8b61-c714b4f0a2fc" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822965 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c89abf-868c-4405-8b61-c714b4f0a2fc" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822973 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579e533f-ff8a-477d-b7ca-99835fec403c" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822979 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="579e533f-ff8a-477d-b7ca-99835fec403c" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: E0131 09:43:35.822987 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f89a5da-e099-4e2f-a95b-fdc648424a96" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.822992 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f89a5da-e099-4e2f-a95b-fdc648424a96" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823137 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823151 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5afd2ddf-cdf8-4a71-862f-b22cceae2852" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823161 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8373451-6f50-4db4-9034-e846de3b5d79" 
containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823172 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c89abf-868c-4405-8b61-c714b4f0a2fc" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823188 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f89a5da-e099-4e2f-a95b-fdc648424a96" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823200 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="579e533f-ff8a-477d-b7ca-99835fec403c" containerName="mariadb-account-create-update" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823208 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f670d9f-8208-4b3d-b7f8-902b28d63375" containerName="mariadb-database-create" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.823808 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.825898 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.831152 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4lhv" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.836893 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-96xss"] Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.893519 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-config-data\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.893687 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-combined-ca-bundle\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.893730 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-db-sync-config-data\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.893787 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6q9\" (UniqueName: 
\"kubernetes.io/projected/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-kube-api-access-xt6q9\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.995592 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-config-data\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.995928 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-combined-ca-bundle\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.996062 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-db-sync-config-data\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:35 crc kubenswrapper[4992]: I0131 09:43:35.996181 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6q9\" (UniqueName: \"kubernetes.io/projected/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-kube-api-access-xt6q9\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.000276 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-config-data\") pod \"glance-db-sync-96xss\" (UID: 
\"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.003024 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-db-sync-config-data\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.006049 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-combined-ca-bundle\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.013360 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6q9\" (UniqueName: \"kubernetes.io/projected/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-kube-api-access-xt6q9\") pod \"glance-db-sync-96xss\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " pod="openstack/glance-db-sync-96xss" Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.175826 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-96xss" Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.718449 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-96xss"] Jan 31 09:43:36 crc kubenswrapper[4992]: I0131 09:43:36.887642 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-96xss" event={"ID":"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d","Type":"ContainerStarted","Data":"13a8490ba61fa49faac6b96588bd2c303b920bd2a8f9968ec309c4c5ada1721c"} Jan 31 09:43:38 crc kubenswrapper[4992]: I0131 09:43:38.618206 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 31 09:43:38 crc kubenswrapper[4992]: I0131 09:43:38.975991 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5wnkp"] Jan 31 09:43:38 crc kubenswrapper[4992]: I0131 09:43:38.981163 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5wnkp"] Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.038878 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mhq92"] Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.040175 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.042290 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.057906 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mhq92"] Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.158625 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74fed1c-c0a3-4120-9724-50d5000661bb-operator-scripts\") pod \"root-account-create-update-mhq92\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.158831 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl9h5\" (UniqueName: \"kubernetes.io/projected/a74fed1c-c0a3-4120-9724-50d5000661bb-kube-api-access-pl9h5\") pod \"root-account-create-update-mhq92\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.192032 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8373451-6f50-4db4-9034-e846de3b5d79" path="/var/lib/kubelet/pods/b8373451-6f50-4db4-9034-e846de3b5d79/volumes" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.260337 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74fed1c-c0a3-4120-9724-50d5000661bb-operator-scripts\") pod \"root-account-create-update-mhq92\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.260519 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl9h5\" (UniqueName: \"kubernetes.io/projected/a74fed1c-c0a3-4120-9724-50d5000661bb-kube-api-access-pl9h5\") pod \"root-account-create-update-mhq92\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.262735 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74fed1c-c0a3-4120-9724-50d5000661bb-operator-scripts\") pod \"root-account-create-update-mhq92\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.289779 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl9h5\" (UniqueName: \"kubernetes.io/projected/a74fed1c-c0a3-4120-9724-50d5000661bb-kube-api-access-pl9h5\") pod \"root-account-create-update-mhq92\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.362029 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.853292 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mhq92"] Jan 31 09:43:39 crc kubenswrapper[4992]: W0131 09:43:39.876042 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda74fed1c_c0a3_4120_9724_50d5000661bb.slice/crio-0ebddf4917b79cd43f0a4736feb39c4f8fda2b0562ee51bde25799f56bfd1a00 WatchSource:0}: Error finding container 0ebddf4917b79cd43f0a4736feb39c4f8fda2b0562ee51bde25799f56bfd1a00: Status 404 returned error can't find the container with id 0ebddf4917b79cd43f0a4736feb39c4f8fda2b0562ee51bde25799f56bfd1a00 Jan 31 09:43:39 crc kubenswrapper[4992]: I0131 09:43:39.910762 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mhq92" event={"ID":"a74fed1c-c0a3-4120-9724-50d5000661bb","Type":"ContainerStarted","Data":"0ebddf4917b79cd43f0a4736feb39c4f8fda2b0562ee51bde25799f56bfd1a00"} Jan 31 09:43:40 crc kubenswrapper[4992]: I0131 09:43:40.921781 4992 generic.go:334] "Generic (PLEG): container finished" podID="a74fed1c-c0a3-4120-9724-50d5000661bb" containerID="d65b48f8cf341fdd90cf9a1a76c3229b7c44a83699a17d5c00c62fecf4da588e" exitCode=0 Jan 31 09:43:40 crc kubenswrapper[4992]: I0131 09:43:40.921827 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mhq92" event={"ID":"a74fed1c-c0a3-4120-9724-50d5000661bb","Type":"ContainerDied","Data":"d65b48f8cf341fdd90cf9a1a76c3229b7c44a83699a17d5c00c62fecf4da588e"} Jan 31 09:43:44 crc kubenswrapper[4992]: I0131 09:43:44.953181 4992 generic.go:334] "Generic (PLEG): container finished" podID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerID="7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc" exitCode=0 Jan 31 09:43:44 crc kubenswrapper[4992]: I0131 09:43:44.953274 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8005e2e7-ed00-4af1-be65-12638ce3a9f9","Type":"ContainerDied","Data":"7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc"} Jan 31 09:43:44 crc kubenswrapper[4992]: I0131 09:43:44.956408 4992 generic.go:334] "Generic (PLEG): container finished" podID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerID="d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa" exitCode=0 Jan 31 09:43:44 crc kubenswrapper[4992]: I0131 09:43:44.956466 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"71b7a97b-2d62-4b05-84f6-fc720ce9c672","Type":"ContainerDied","Data":"d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa"} Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.461120 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vxtkq" podUID="9aca01f7-a9ff-4d25-a330-c505e93a3cd0" containerName="ovn-controller" probeResult="failure" output=< Jan 31 09:43:46 crc kubenswrapper[4992]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 31 09:43:46 crc kubenswrapper[4992]: > Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.484818 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.496799 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l45p8" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.702875 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vxtkq-config-577cn"] Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.703880 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.714611 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vxtkq-config-577cn"] Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.715024 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.803303 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.803389 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run-ovn\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.803413 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pzjj\" (UniqueName: \"kubernetes.io/projected/b0cd4435-8c0c-41dc-8e7d-783bb8052959-kube-api-access-2pzjj\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.803491 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-log-ovn\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: 
\"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.803510 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-additional-scripts\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.803687 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-scripts\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905266 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905312 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run-ovn\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905338 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pzjj\" (UniqueName: \"kubernetes.io/projected/b0cd4435-8c0c-41dc-8e7d-783bb8052959-kube-api-access-2pzjj\") pod 
\"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905368 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-log-ovn\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905391 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-additional-scripts\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905483 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-scripts\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905637 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905663 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run-ovn\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: 
\"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.905661 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-log-ovn\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.906252 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-additional-scripts\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.907650 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-scripts\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:46 crc kubenswrapper[4992]: I0131 09:43:46.944290 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pzjj\" (UniqueName: \"kubernetes.io/projected/b0cd4435-8c0c-41dc-8e7d-783bb8052959-kube-api-access-2pzjj\") pod \"ovn-controller-vxtkq-config-577cn\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:47 crc kubenswrapper[4992]: I0131 09:43:47.040052 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:49 crc kubenswrapper[4992]: I0131 09:43:49.908272 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:49 crc kubenswrapper[4992]: I0131 09:43:49.959181 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74fed1c-c0a3-4120-9724-50d5000661bb-operator-scripts\") pod \"a74fed1c-c0a3-4120-9724-50d5000661bb\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " Jan 31 09:43:49 crc kubenswrapper[4992]: I0131 09:43:49.959226 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl9h5\" (UniqueName: \"kubernetes.io/projected/a74fed1c-c0a3-4120-9724-50d5000661bb-kube-api-access-pl9h5\") pod \"a74fed1c-c0a3-4120-9724-50d5000661bb\" (UID: \"a74fed1c-c0a3-4120-9724-50d5000661bb\") " Jan 31 09:43:49 crc kubenswrapper[4992]: I0131 09:43:49.962026 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a74fed1c-c0a3-4120-9724-50d5000661bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a74fed1c-c0a3-4120-9724-50d5000661bb" (UID: "a74fed1c-c0a3-4120-9724-50d5000661bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:49 crc kubenswrapper[4992]: I0131 09:43:49.965131 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74fed1c-c0a3-4120-9724-50d5000661bb-kube-api-access-pl9h5" (OuterVolumeSpecName: "kube-api-access-pl9h5") pod "a74fed1c-c0a3-4120-9724-50d5000661bb" (UID: "a74fed1c-c0a3-4120-9724-50d5000661bb"). InnerVolumeSpecName "kube-api-access-pl9h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:50 crc kubenswrapper[4992]: I0131 09:43:50.002642 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mhq92" event={"ID":"a74fed1c-c0a3-4120-9724-50d5000661bb","Type":"ContainerDied","Data":"0ebddf4917b79cd43f0a4736feb39c4f8fda2b0562ee51bde25799f56bfd1a00"} Jan 31 09:43:50 crc kubenswrapper[4992]: I0131 09:43:50.002693 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mhq92" Jan 31 09:43:50 crc kubenswrapper[4992]: I0131 09:43:50.002705 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ebddf4917b79cd43f0a4736feb39c4f8fda2b0562ee51bde25799f56bfd1a00" Jan 31 09:43:50 crc kubenswrapper[4992]: I0131 09:43:50.061189 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a74fed1c-c0a3-4120-9724-50d5000661bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:50 crc kubenswrapper[4992]: I0131 09:43:50.061227 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl9h5\" (UniqueName: \"kubernetes.io/projected/a74fed1c-c0a3-4120-9724-50d5000661bb-kube-api-access-pl9h5\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:50 crc kubenswrapper[4992]: I0131 09:43:50.196834 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vxtkq-config-577cn"] Jan 31 09:43:50 crc kubenswrapper[4992]: W0131 09:43:50.206572 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0cd4435_8c0c_41dc_8e7d_783bb8052959.slice/crio-73058ca057260e4435af0993ec397cba3a7ddf8bae2575345579a67d251b66e9 WatchSource:0}: Error finding container 73058ca057260e4435af0993ec397cba3a7ddf8bae2575345579a67d251b66e9: Status 404 returned error can't find the container with id 
73058ca057260e4435af0993ec397cba3a7ddf8bae2575345579a67d251b66e9 Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.011052 4992 generic.go:334] "Generic (PLEG): container finished" podID="b0cd4435-8c0c-41dc-8e7d-783bb8052959" containerID="752ccfd32048876456e648088b75041d67eb462b98eb9b039c68910f9ac3eab7" exitCode=0 Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.011162 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vxtkq-config-577cn" event={"ID":"b0cd4435-8c0c-41dc-8e7d-783bb8052959","Type":"ContainerDied","Data":"752ccfd32048876456e648088b75041d67eb462b98eb9b039c68910f9ac3eab7"} Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.011684 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vxtkq-config-577cn" event={"ID":"b0cd4435-8c0c-41dc-8e7d-783bb8052959","Type":"ContainerStarted","Data":"73058ca057260e4435af0993ec397cba3a7ddf8bae2575345579a67d251b66e9"} Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.013760 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8005e2e7-ed00-4af1-be65-12638ce3a9f9","Type":"ContainerStarted","Data":"3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda"} Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.014387 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.015400 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"71b7a97b-2d62-4b05-84f6-fc720ce9c672","Type":"ContainerStarted","Data":"cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be"} Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.015696 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.016575 4992 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-96xss" event={"ID":"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d","Type":"ContainerStarted","Data":"c8dbd09a79b362ae518f1af8233235c2d81e2132f2ca037d570628b1a9b467d4"} Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.060778 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-96xss" podStartSLOduration=2.935890249 podStartE2EDuration="16.060757912s" podCreationTimestamp="2026-01-31 09:43:35 +0000 UTC" firstStartedPulling="2026-01-31 09:43:36.726806617 +0000 UTC m=+1112.698198614" lastFinishedPulling="2026-01-31 09:43:49.85167429 +0000 UTC m=+1125.823066277" observedRunningTime="2026-01-31 09:43:51.046949054 +0000 UTC m=+1127.018341071" watchObservedRunningTime="2026-01-31 09:43:51.060757912 +0000 UTC m=+1127.032149909" Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.084978 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.861645339 podStartE2EDuration="1m5.084958939s" podCreationTimestamp="2026-01-31 09:42:46 +0000 UTC" firstStartedPulling="2026-01-31 09:42:57.822986371 +0000 UTC m=+1073.794378358" lastFinishedPulling="2026-01-31 09:43:10.046299981 +0000 UTC m=+1086.017691958" observedRunningTime="2026-01-31 09:43:51.071682927 +0000 UTC m=+1127.043074954" watchObservedRunningTime="2026-01-31 09:43:51.084958939 +0000 UTC m=+1127.056350926" Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.114238 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=58.966936638 podStartE2EDuration="1m5.114220462s" podCreationTimestamp="2026-01-31 09:42:46 +0000 UTC" firstStartedPulling="2026-01-31 09:43:03.899190832 +0000 UTC m=+1079.870582819" lastFinishedPulling="2026-01-31 09:43:10.046474656 +0000 UTC m=+1086.017866643" observedRunningTime="2026-01-31 09:43:51.107131358 +0000 UTC m=+1127.078523345" 
watchObservedRunningTime="2026-01-31 09:43:51.114220462 +0000 UTC m=+1127.085612449" Jan 31 09:43:51 crc kubenswrapper[4992]: I0131 09:43:51.469148 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vxtkq" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.324361 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.398555 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-additional-scripts\") pod \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.398620 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pzjj\" (UniqueName: \"kubernetes.io/projected/b0cd4435-8c0c-41dc-8e7d-783bb8052959-kube-api-access-2pzjj\") pod \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.398699 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run-ovn\") pod \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.398772 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-scripts\") pod \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.398818 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-log-ovn\") pod \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.398835 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run\") pod \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\" (UID: \"b0cd4435-8c0c-41dc-8e7d-783bb8052959\") " Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.399188 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run" (OuterVolumeSpecName: "var-run") pod "b0cd4435-8c0c-41dc-8e7d-783bb8052959" (UID: "b0cd4435-8c0c-41dc-8e7d-783bb8052959"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.399918 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b0cd4435-8c0c-41dc-8e7d-783bb8052959" (UID: "b0cd4435-8c0c-41dc-8e7d-783bb8052959"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.400077 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b0cd4435-8c0c-41dc-8e7d-783bb8052959" (UID: "b0cd4435-8c0c-41dc-8e7d-783bb8052959"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.400112 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b0cd4435-8c0c-41dc-8e7d-783bb8052959" (UID: "b0cd4435-8c0c-41dc-8e7d-783bb8052959"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.400767 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-scripts" (OuterVolumeSpecName: "scripts") pod "b0cd4435-8c0c-41dc-8e7d-783bb8052959" (UID: "b0cd4435-8c0c-41dc-8e7d-783bb8052959"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.404358 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cd4435-8c0c-41dc-8e7d-783bb8052959-kube-api-access-2pzjj" (OuterVolumeSpecName: "kube-api-access-2pzjj") pod "b0cd4435-8c0c-41dc-8e7d-783bb8052959" (UID: "b0cd4435-8c0c-41dc-8e7d-783bb8052959"). InnerVolumeSpecName "kube-api-access-2pzjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.501474 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.501528 4992 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.501547 4992 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.501564 4992 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b0cd4435-8c0c-41dc-8e7d-783bb8052959-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.501588 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pzjj\" (UniqueName: \"kubernetes.io/projected/b0cd4435-8c0c-41dc-8e7d-783bb8052959-kube-api-access-2pzjj\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:52 crc kubenswrapper[4992]: I0131 09:43:52.501605 4992 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0cd4435-8c0c-41dc-8e7d-783bb8052959-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:53 crc kubenswrapper[4992]: I0131 09:43:53.034454 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vxtkq-config-577cn" event={"ID":"b0cd4435-8c0c-41dc-8e7d-783bb8052959","Type":"ContainerDied","Data":"73058ca057260e4435af0993ec397cba3a7ddf8bae2575345579a67d251b66e9"} Jan 31 09:43:53 crc 
kubenswrapper[4992]: I0131 09:43:53.034708 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73058ca057260e4435af0993ec397cba3a7ddf8bae2575345579a67d251b66e9" Jan 31 09:43:53 crc kubenswrapper[4992]: I0131 09:43:53.034494 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vxtkq-config-577cn" Jan 31 09:43:53 crc kubenswrapper[4992]: I0131 09:43:53.411004 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vxtkq-config-577cn"] Jan 31 09:43:53 crc kubenswrapper[4992]: I0131 09:43:53.416347 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vxtkq-config-577cn"] Jan 31 09:43:55 crc kubenswrapper[4992]: I0131 09:43:55.193547 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cd4435-8c0c-41dc-8e7d-783bb8052959" path="/var/lib/kubelet/pods/b0cd4435-8c0c-41dc-8e7d-783bb8052959/volumes" Jan 31 09:43:58 crc kubenswrapper[4992]: I0131 09:43:58.072376 4992 generic.go:334] "Generic (PLEG): container finished" podID="fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" containerID="c8dbd09a79b362ae518f1af8233235c2d81e2132f2ca037d570628b1a9b467d4" exitCode=0 Jan 31 09:43:58 crc kubenswrapper[4992]: I0131 09:43:58.072488 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-96xss" event={"ID":"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d","Type":"ContainerDied","Data":"c8dbd09a79b362ae518f1af8233235c2d81e2132f2ca037d570628b1a9b467d4"} Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.475034 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-96xss" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.618109 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-config-data\") pod \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.618536 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt6q9\" (UniqueName: \"kubernetes.io/projected/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-kube-api-access-xt6q9\") pod \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.618614 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-db-sync-config-data\") pod \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.618666 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-combined-ca-bundle\") pod \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\" (UID: \"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d\") " Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.624606 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" (UID: "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.624617 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-kube-api-access-xt6q9" (OuterVolumeSpecName: "kube-api-access-xt6q9") pod "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" (UID: "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d"). InnerVolumeSpecName "kube-api-access-xt6q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.645264 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" (UID: "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.680627 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-config-data" (OuterVolumeSpecName: "config-data") pod "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" (UID: "fc5e7d3d-d54b-4b87-8578-34d1764e7e0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.721491 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.721550 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt6q9\" (UniqueName: \"kubernetes.io/projected/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-kube-api-access-xt6q9\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.721575 4992 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:43:59 crc kubenswrapper[4992]: I0131 09:43:59.721596 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.093507 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-96xss" event={"ID":"fc5e7d3d-d54b-4b87-8578-34d1764e7e0d","Type":"ContainerDied","Data":"13a8490ba61fa49faac6b96588bd2c303b920bd2a8f9968ec309c4c5ada1721c"} Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.093572 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13a8490ba61fa49faac6b96588bd2c303b920bd2a8f9968ec309c4c5ada1721c" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.093742 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-96xss" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596207 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-9nv8v"] Jan 31 09:44:00 crc kubenswrapper[4992]: E0131 09:44:00.596531 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" containerName="glance-db-sync" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596545 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" containerName="glance-db-sync" Jan 31 09:44:00 crc kubenswrapper[4992]: E0131 09:44:00.596559 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cd4435-8c0c-41dc-8e7d-783bb8052959" containerName="ovn-config" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596566 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cd4435-8c0c-41dc-8e7d-783bb8052959" containerName="ovn-config" Jan 31 09:44:00 crc kubenswrapper[4992]: E0131 09:44:00.596580 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74fed1c-c0a3-4120-9724-50d5000661bb" containerName="mariadb-account-create-update" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596586 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74fed1c-c0a3-4120-9724-50d5000661bb" containerName="mariadb-account-create-update" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596723 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" containerName="glance-db-sync" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596732 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74fed1c-c0a3-4120-9724-50d5000661bb" containerName="mariadb-account-create-update" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.596740 4992 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b0cd4435-8c0c-41dc-8e7d-783bb8052959" containerName="ovn-config" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.597619 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.612845 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-9nv8v"] Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.737872 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-dns-svc\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.738070 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.738103 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gckk\" (UniqueName: \"kubernetes.io/projected/25e5d88a-2fe1-45fe-a262-bb5ef7742563-kube-api-access-2gckk\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.738172 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-config\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " 
pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.738543 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.839648 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-dns-svc\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.839726 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.839753 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gckk\" (UniqueName: \"kubernetes.io/projected/25e5d88a-2fe1-45fe-a262-bb5ef7742563-kube-api-access-2gckk\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.839803 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-config\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 
09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.839844 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.840769 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-dns-svc\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.840800 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-nb\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.840824 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-config\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.840901 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-sb\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.856539 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2gckk\" (UniqueName: \"kubernetes.io/projected/25e5d88a-2fe1-45fe-a262-bb5ef7742563-kube-api-access-2gckk\") pod \"dnsmasq-dns-554567b4f7-9nv8v\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:00 crc kubenswrapper[4992]: I0131 09:44:00.919218 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:01 crc kubenswrapper[4992]: I0131 09:44:01.360640 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-9nv8v"] Jan 31 09:44:01 crc kubenswrapper[4992]: W0131 09:44:01.368779 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25e5d88a_2fe1_45fe_a262_bb5ef7742563.slice/crio-5a4d502db8c22ea4bf41071c2e5d5dc3b546a27ef85841d047ef3f7564c101e9 WatchSource:0}: Error finding container 5a4d502db8c22ea4bf41071c2e5d5dc3b546a27ef85841d047ef3f7564c101e9: Status 404 returned error can't find the container with id 5a4d502db8c22ea4bf41071c2e5d5dc3b546a27ef85841d047ef3f7564c101e9 Jan 31 09:44:02 crc kubenswrapper[4992]: I0131 09:44:02.110614 4992 generic.go:334] "Generic (PLEG): container finished" podID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerID="fafcd8c478aa5fb4b8a11cea115a0511b557a1d7e80558d5d26bee5d9f02f0d1" exitCode=0 Jan 31 09:44:02 crc kubenswrapper[4992]: I0131 09:44:02.110713 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" event={"ID":"25e5d88a-2fe1-45fe-a262-bb5ef7742563","Type":"ContainerDied","Data":"fafcd8c478aa5fb4b8a11cea115a0511b557a1d7e80558d5d26bee5d9f02f0d1"} Jan 31 09:44:02 crc kubenswrapper[4992]: I0131 09:44:02.110964 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" 
event={"ID":"25e5d88a-2fe1-45fe-a262-bb5ef7742563","Type":"ContainerStarted","Data":"5a4d502db8c22ea4bf41071c2e5d5dc3b546a27ef85841d047ef3f7564c101e9"} Jan 31 09:44:03 crc kubenswrapper[4992]: I0131 09:44:03.119554 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" event={"ID":"25e5d88a-2fe1-45fe-a262-bb5ef7742563","Type":"ContainerStarted","Data":"8cda2b321cd22de41db6883e58f0e8a1afc3e092dd796236a9f3c64668bdea5f"} Jan 31 09:44:03 crc kubenswrapper[4992]: I0131 09:44:03.119821 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:03 crc kubenswrapper[4992]: I0131 09:44:03.141502 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" podStartSLOduration=3.141484448 podStartE2EDuration="3.141484448s" podCreationTimestamp="2026-01-31 09:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:03.13564779 +0000 UTC m=+1139.107039797" watchObservedRunningTime="2026-01-31 09:44:03.141484448 +0000 UTC m=+1139.112876435" Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.542718 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.898067 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sb5bv"] Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.899021 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.910938 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sb5bv"] Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.916646 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.920628 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fa36-account-create-update-cczt6"] Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.922023 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.931992 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 31 09:44:07 crc kubenswrapper[4992]: I0131 09:44:07.949594 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fa36-account-create-update-cczt6"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.017862 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-q7bsh"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.019050 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.032798 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-q7bsh"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.074748 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f80b2ba9-24f2-44ec-a523-74a843ee40dd-operator-scripts\") pod \"cinder-db-create-sb5bv\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.075624 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqpc\" (UniqueName: \"kubernetes.io/projected/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-kube-api-access-pfqpc\") pod \"cinder-fa36-account-create-update-cczt6\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.075698 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwjw4\" (UniqueName: \"kubernetes.io/projected/f80b2ba9-24f2-44ec-a523-74a843ee40dd-kube-api-access-vwjw4\") pod \"cinder-db-create-sb5bv\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.075761 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-operator-scripts\") pod \"cinder-fa36-account-create-update-cczt6\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.116427 4992 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/neutron-c015-account-create-update-lm7x4"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.120481 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.122917 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.138139 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c015-account-create-update-lm7x4"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.177606 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqpc\" (UniqueName: \"kubernetes.io/projected/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-kube-api-access-pfqpc\") pod \"cinder-fa36-account-create-update-cczt6\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.177658 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwjw4\" (UniqueName: \"kubernetes.io/projected/f80b2ba9-24f2-44ec-a523-74a843ee40dd-kube-api-access-vwjw4\") pod \"cinder-db-create-sb5bv\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.177719 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-operator-scripts\") pod \"cinder-fa36-account-create-update-cczt6\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.177750 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f80b2ba9-24f2-44ec-a523-74a843ee40dd-operator-scripts\") pod \"cinder-db-create-sb5bv\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.177820 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w569d\" (UniqueName: \"kubernetes.io/projected/690c5d16-8767-4215-adcc-6c52a3f214f9-kube-api-access-w569d\") pod \"barbican-db-create-q7bsh\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.177838 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/690c5d16-8767-4215-adcc-6c52a3f214f9-operator-scripts\") pod \"barbican-db-create-q7bsh\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.178644 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f80b2ba9-24f2-44ec-a523-74a843ee40dd-operator-scripts\") pod \"cinder-db-create-sb5bv\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.178651 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-operator-scripts\") pod \"cinder-fa36-account-create-update-cczt6\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.209294 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqpc\" 
(UniqueName: \"kubernetes.io/projected/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-kube-api-access-pfqpc\") pod \"cinder-fa36-account-create-update-cczt6\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.211761 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwjw4\" (UniqueName: \"kubernetes.io/projected/f80b2ba9-24f2-44ec-a523-74a843ee40dd-kube-api-access-vwjw4\") pod \"cinder-db-create-sb5bv\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.216261 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.219807 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2fa6-account-create-update-cjr6n"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.220781 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.224814 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.233717 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2fa6-account-create-update-cjr6n"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.247835 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.285433 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w569d\" (UniqueName: \"kubernetes.io/projected/690c5d16-8767-4215-adcc-6c52a3f214f9-kube-api-access-w569d\") pod \"barbican-db-create-q7bsh\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.285474 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/690c5d16-8767-4215-adcc-6c52a3f214f9-operator-scripts\") pod \"barbican-db-create-q7bsh\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.285507 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9x7n\" (UniqueName: \"kubernetes.io/projected/d23756fe-5e0d-43f4-a977-a9058b096998-kube-api-access-f9x7n\") pod \"neutron-c015-account-create-update-lm7x4\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.285549 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d23756fe-5e0d-43f4-a977-a9058b096998-operator-scripts\") pod \"neutron-c015-account-create-update-lm7x4\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.286399 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/690c5d16-8767-4215-adcc-6c52a3f214f9-operator-scripts\") pod 
\"barbican-db-create-q7bsh\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.309862 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tfxcr"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.310871 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.331231 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-lw589"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.332525 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.337965 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w569d\" (UniqueName: \"kubernetes.io/projected/690c5d16-8767-4215-adcc-6c52a3f214f9-kube-api-access-w569d\") pod \"barbican-db-create-q7bsh\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.339780 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.340711 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.340927 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.341470 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.342074 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkf5n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.346374 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tfxcr"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.370548 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lw589"] Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.388157 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4c3a5c-ab08-4b94-877e-73e2641429d4-operator-scripts\") pod \"barbican-2fa6-account-create-update-cjr6n\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.388207 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6x2m\" (UniqueName: \"kubernetes.io/projected/35185358-7b72-425e-af0b-c52b7887ce93-kube-api-access-q6x2m\") pod \"neutron-db-create-tfxcr\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.388242 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f9x7n\" (UniqueName: \"kubernetes.io/projected/d23756fe-5e0d-43f4-a977-a9058b096998-kube-api-access-f9x7n\") pod \"neutron-c015-account-create-update-lm7x4\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.388271 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d23756fe-5e0d-43f4-a977-a9058b096998-operator-scripts\") pod \"neutron-c015-account-create-update-lm7x4\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.388304 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r82jt\" (UniqueName: \"kubernetes.io/projected/3d4c3a5c-ab08-4b94-877e-73e2641429d4-kube-api-access-r82jt\") pod \"barbican-2fa6-account-create-update-cjr6n\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.388392 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35185358-7b72-425e-af0b-c52b7887ce93-operator-scripts\") pod \"neutron-db-create-tfxcr\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.389357 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d23756fe-5e0d-43f4-a977-a9058b096998-operator-scripts\") pod \"neutron-c015-account-create-update-lm7x4\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " 
pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.436611 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9x7n\" (UniqueName: \"kubernetes.io/projected/d23756fe-5e0d-43f4-a977-a9058b096998-kube-api-access-f9x7n\") pod \"neutron-c015-account-create-update-lm7x4\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.440526 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.493479 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xdbj\" (UniqueName: \"kubernetes.io/projected/9ac29a1f-926d-44c3-b380-4f48340ad9ce-kube-api-access-2xdbj\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.493853 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4c3a5c-ab08-4b94-877e-73e2641429d4-operator-scripts\") pod \"barbican-2fa6-account-create-update-cjr6n\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.493892 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6x2m\" (UniqueName: \"kubernetes.io/projected/35185358-7b72-425e-af0b-c52b7887ce93-kube-api-access-q6x2m\") pod \"neutron-db-create-tfxcr\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.493985 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r82jt\" (UniqueName: \"kubernetes.io/projected/3d4c3a5c-ab08-4b94-877e-73e2641429d4-kube-api-access-r82jt\") pod \"barbican-2fa6-account-create-update-cjr6n\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.494042 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-combined-ca-bundle\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.494124 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-config-data\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.494161 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35185358-7b72-425e-af0b-c52b7887ce93-operator-scripts\") pod \"neutron-db-create-tfxcr\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.494956 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35185358-7b72-425e-af0b-c52b7887ce93-operator-scripts\") pod \"neutron-db-create-tfxcr\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.496318 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4c3a5c-ab08-4b94-877e-73e2641429d4-operator-scripts\") pod \"barbican-2fa6-account-create-update-cjr6n\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.514161 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6x2m\" (UniqueName: \"kubernetes.io/projected/35185358-7b72-425e-af0b-c52b7887ce93-kube-api-access-q6x2m\") pod \"neutron-db-create-tfxcr\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.518284 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r82jt\" (UniqueName: \"kubernetes.io/projected/3d4c3a5c-ab08-4b94-877e-73e2641429d4-kube-api-access-r82jt\") pod \"barbican-2fa6-account-create-update-cjr6n\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.595338 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-config-data\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.595410 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xdbj\" (UniqueName: \"kubernetes.io/projected/9ac29a1f-926d-44c3-b380-4f48340ad9ce-kube-api-access-2xdbj\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.595525 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-combined-ca-bundle\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.599931 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-config-data\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.612175 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-combined-ca-bundle\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.624533 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xdbj\" (UniqueName: \"kubernetes.io/projected/9ac29a1f-926d-44c3-b380-4f48340ad9ce-kube-api-access-2xdbj\") pod \"keystone-db-sync-lw589\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.707786 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.745022 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.765632 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:08 crc kubenswrapper[4992]: I0131 09:44:08.837236 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sb5bv"] Jan 31 09:44:08 crc kubenswrapper[4992]: W0131 09:44:08.841562 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf80b2ba9_24f2_44ec_a523_74a843ee40dd.slice/crio-57342c4f623daf45e9d8a83f27111262026797c65928cc7e1cd9e942100eda5c WatchSource:0}: Error finding container 57342c4f623daf45e9d8a83f27111262026797c65928cc7e1cd9e942100eda5c: Status 404 returned error can't find the container with id 57342c4f623daf45e9d8a83f27111262026797c65928cc7e1cd9e942100eda5c Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.060340 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fa36-account-create-update-cczt6"] Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.073243 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-q7bsh"] Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.173081 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb5bv" event={"ID":"f80b2ba9-24f2-44ec-a523-74a843ee40dd","Type":"ContainerStarted","Data":"57342c4f623daf45e9d8a83f27111262026797c65928cc7e1cd9e942100eda5c"} Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.173969 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q7bsh" event={"ID":"690c5d16-8767-4215-adcc-6c52a3f214f9","Type":"ContainerStarted","Data":"125de38f99c484fd0a9fb5c0e7ca12ad69f35efe7657da70800dda0781bd49f7"} Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.256282 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fa36-account-create-update-cczt6" 
event={"ID":"ab96b3c5-39bc-40ae-a1eb-2a751e90c944","Type":"ContainerStarted","Data":"52a0ad1a4a672112d2e171619760dad71636ee44edab2b40d05fb2b92d747a57"} Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.258495 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c015-account-create-update-lm7x4"] Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.410635 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2fa6-account-create-update-cjr6n"] Jan 31 09:44:09 crc kubenswrapper[4992]: W0131 09:44:09.424986 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d4c3a5c_ab08_4b94_877e_73e2641429d4.slice/crio-9cc39bbef945b6514ae6eddb0ea176371f91fed7af65bf05b385f827d608b9b1 WatchSource:0}: Error finding container 9cc39bbef945b6514ae6eddb0ea176371f91fed7af65bf05b385f827d608b9b1: Status 404 returned error can't find the container with id 9cc39bbef945b6514ae6eddb0ea176371f91fed7af65bf05b385f827d608b9b1 Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.605132 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tfxcr"] Jan 31 09:44:09 crc kubenswrapper[4992]: I0131 09:44:09.613505 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-lw589"] Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.247519 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lw589" event={"ID":"9ac29a1f-926d-44c3-b380-4f48340ad9ce","Type":"ContainerStarted","Data":"6e62146682284c2f524ada5818bda8ac13ce5ee8ac4ada19644702254639f3c9"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.249108 4992 generic.go:334] "Generic (PLEG): container finished" podID="ab96b3c5-39bc-40ae-a1eb-2a751e90c944" containerID="20cfb4142b2181223dbe1e2a0e38750978856209645f96870a80021175c07ee8" exitCode=0 Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.249184 4992 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fa36-account-create-update-cczt6" event={"ID":"ab96b3c5-39bc-40ae-a1eb-2a751e90c944","Type":"ContainerDied","Data":"20cfb4142b2181223dbe1e2a0e38750978856209645f96870a80021175c07ee8"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.250664 4992 generic.go:334] "Generic (PLEG): container finished" podID="3d4c3a5c-ab08-4b94-877e-73e2641429d4" containerID="cc7fd2a0dd88b452290fbb5cc0e910cd2b7723f244931e175efb443ec0d57191" exitCode=0 Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.250725 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2fa6-account-create-update-cjr6n" event={"ID":"3d4c3a5c-ab08-4b94-877e-73e2641429d4","Type":"ContainerDied","Data":"cc7fd2a0dd88b452290fbb5cc0e910cd2b7723f244931e175efb443ec0d57191"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.250750 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2fa6-account-create-update-cjr6n" event={"ID":"3d4c3a5c-ab08-4b94-877e-73e2641429d4","Type":"ContainerStarted","Data":"9cc39bbef945b6514ae6eddb0ea176371f91fed7af65bf05b385f827d608b9b1"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.256049 4992 generic.go:334] "Generic (PLEG): container finished" podID="d23756fe-5e0d-43f4-a977-a9058b096998" containerID="e571af91bd1193e5938ccaa793ae49bc150646e1a06122e8f7850f386fb01fb9" exitCode=0 Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.256115 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c015-account-create-update-lm7x4" event={"ID":"d23756fe-5e0d-43f4-a977-a9058b096998","Type":"ContainerDied","Data":"e571af91bd1193e5938ccaa793ae49bc150646e1a06122e8f7850f386fb01fb9"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.256134 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c015-account-create-update-lm7x4" 
event={"ID":"d23756fe-5e0d-43f4-a977-a9058b096998","Type":"ContainerStarted","Data":"5ec8bb6e69e8f45ca99a4abf0fb1855715aab7db03e84f4e2d1a659897f29245"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.258522 4992 generic.go:334] "Generic (PLEG): container finished" podID="35185358-7b72-425e-af0b-c52b7887ce93" containerID="b6b8f0fb362912d1a816259fb0b3d0d5f42c08100b7fbb93c4a53af70ba6456b" exitCode=0 Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.258651 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tfxcr" event={"ID":"35185358-7b72-425e-af0b-c52b7887ce93","Type":"ContainerDied","Data":"b6b8f0fb362912d1a816259fb0b3d0d5f42c08100b7fbb93c4a53af70ba6456b"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.258681 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tfxcr" event={"ID":"35185358-7b72-425e-af0b-c52b7887ce93","Type":"ContainerStarted","Data":"3edd31cc314112bdade4f8283c1c7d6875941ce00f0ba495893bb27399c7eaa0"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.260636 4992 generic.go:334] "Generic (PLEG): container finished" podID="f80b2ba9-24f2-44ec-a523-74a843ee40dd" containerID="4c762f2076813a7d523ae8961273df420ec2c0ed0d7ef616a69860a461c91864" exitCode=0 Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.260835 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb5bv" event={"ID":"f80b2ba9-24f2-44ec-a523-74a843ee40dd","Type":"ContainerDied","Data":"4c762f2076813a7d523ae8961273df420ec2c0ed0d7ef616a69860a461c91864"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.262381 4992 generic.go:334] "Generic (PLEG): container finished" podID="690c5d16-8767-4215-adcc-6c52a3f214f9" containerID="f53b8af4592fabf7cc60bff83aca365372f9799cdb7b03014b435fc1c76e961f" exitCode=0 Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.262434 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-q7bsh" event={"ID":"690c5d16-8767-4215-adcc-6c52a3f214f9","Type":"ContainerDied","Data":"f53b8af4592fabf7cc60bff83aca365372f9799cdb7b03014b435fc1c76e961f"} Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.920617 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.975838 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gwvqv"] Jan 31 09:44:10 crc kubenswrapper[4992]: I0131 09:44:10.976991 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-gwvqv" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerName="dnsmasq-dns" containerID="cri-o://4e0af4a41c8debfa41dd5f7c5e3f6445b561fd01ecb9c6be9daa74e4832aeb73" gracePeriod=10 Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.292388 4992 generic.go:334] "Generic (PLEG): container finished" podID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerID="4e0af4a41c8debfa41dd5f7c5e3f6445b561fd01ecb9c6be9daa74e4832aeb73" exitCode=0 Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.292568 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gwvqv" event={"ID":"71b91424-3380-42f6-a2a3-edcb31b2eee2","Type":"ContainerDied","Data":"4e0af4a41c8debfa41dd5f7c5e3f6445b561fd01ecb9c6be9daa74e4832aeb73"} Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.436882 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.572910 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-dns-svc\") pod \"71b91424-3380-42f6-a2a3-edcb31b2eee2\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.573005 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49t8d\" (UniqueName: \"kubernetes.io/projected/71b91424-3380-42f6-a2a3-edcb31b2eee2-kube-api-access-49t8d\") pod \"71b91424-3380-42f6-a2a3-edcb31b2eee2\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.573058 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-config\") pod \"71b91424-3380-42f6-a2a3-edcb31b2eee2\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.573092 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-nb\") pod \"71b91424-3380-42f6-a2a3-edcb31b2eee2\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.573123 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-sb\") pod \"71b91424-3380-42f6-a2a3-edcb31b2eee2\" (UID: \"71b91424-3380-42f6-a2a3-edcb31b2eee2\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.595983 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/71b91424-3380-42f6-a2a3-edcb31b2eee2-kube-api-access-49t8d" (OuterVolumeSpecName: "kube-api-access-49t8d") pod "71b91424-3380-42f6-a2a3-edcb31b2eee2" (UID: "71b91424-3380-42f6-a2a3-edcb31b2eee2"). InnerVolumeSpecName "kube-api-access-49t8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.627864 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71b91424-3380-42f6-a2a3-edcb31b2eee2" (UID: "71b91424-3380-42f6-a2a3-edcb31b2eee2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.637835 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71b91424-3380-42f6-a2a3-edcb31b2eee2" (UID: "71b91424-3380-42f6-a2a3-edcb31b2eee2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.642160 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71b91424-3380-42f6-a2a3-edcb31b2eee2" (UID: "71b91424-3380-42f6-a2a3-edcb31b2eee2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.646533 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-config" (OuterVolumeSpecName: "config") pod "71b91424-3380-42f6-a2a3-edcb31b2eee2" (UID: "71b91424-3380-42f6-a2a3-edcb31b2eee2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.688622 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.688903 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49t8d\" (UniqueName: \"kubernetes.io/projected/71b91424-3380-42f6-a2a3-edcb31b2eee2-kube-api-access-49t8d\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.688995 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.689079 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.689192 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71b91424-3380-42f6-a2a3-edcb31b2eee2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.761526 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.893180 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwjw4\" (UniqueName: \"kubernetes.io/projected/f80b2ba9-24f2-44ec-a523-74a843ee40dd-kube-api-access-vwjw4\") pod \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.893254 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f80b2ba9-24f2-44ec-a523-74a843ee40dd-operator-scripts\") pod \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\" (UID: \"f80b2ba9-24f2-44ec-a523-74a843ee40dd\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.894030 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80b2ba9-24f2-44ec-a523-74a843ee40dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f80b2ba9-24f2-44ec-a523-74a843ee40dd" (UID: "f80b2ba9-24f2-44ec-a523-74a843ee40dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.898494 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80b2ba9-24f2-44ec-a523-74a843ee40dd-kube-api-access-vwjw4" (OuterVolumeSpecName: "kube-api-access-vwjw4") pod "f80b2ba9-24f2-44ec-a523-74a843ee40dd" (UID: "f80b2ba9-24f2-44ec-a523-74a843ee40dd"). InnerVolumeSpecName "kube-api-access-vwjw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.902994 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.932627 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.958999 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.968433 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.988765 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.994933 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6x2m\" (UniqueName: \"kubernetes.io/projected/35185358-7b72-425e-af0b-c52b7887ce93-kube-api-access-q6x2m\") pod \"35185358-7b72-425e-af0b-c52b7887ce93\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.995090 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35185358-7b72-425e-af0b-c52b7887ce93-operator-scripts\") pod \"35185358-7b72-425e-af0b-c52b7887ce93\" (UID: \"35185358-7b72-425e-af0b-c52b7887ce93\") " Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.995483 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwjw4\" (UniqueName: \"kubernetes.io/projected/f80b2ba9-24f2-44ec-a523-74a843ee40dd-kube-api-access-vwjw4\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.995508 4992 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f80b2ba9-24f2-44ec-a523-74a843ee40dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.995918 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35185358-7b72-425e-af0b-c52b7887ce93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35185358-7b72-425e-af0b-c52b7887ce93" (UID: "35185358-7b72-425e-af0b-c52b7887ce93"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:11 crc kubenswrapper[4992]: I0131 09:44:11.998207 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35185358-7b72-425e-af0b-c52b7887ce93-kube-api-access-q6x2m" (OuterVolumeSpecName: "kube-api-access-q6x2m") pod "35185358-7b72-425e-af0b-c52b7887ce93" (UID: "35185358-7b72-425e-af0b-c52b7887ce93"). InnerVolumeSpecName "kube-api-access-q6x2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098142 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r82jt\" (UniqueName: \"kubernetes.io/projected/3d4c3a5c-ab08-4b94-877e-73e2641429d4-kube-api-access-r82jt\") pod \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098208 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-operator-scripts\") pod \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098269 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4c3a5c-ab08-4b94-877e-73e2641429d4-operator-scripts\") pod \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\" (UID: \"3d4c3a5c-ab08-4b94-877e-73e2641429d4\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098313 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9x7n\" (UniqueName: \"kubernetes.io/projected/d23756fe-5e0d-43f4-a977-a9058b096998-kube-api-access-f9x7n\") pod \"d23756fe-5e0d-43f4-a977-a9058b096998\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098378 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d23756fe-5e0d-43f4-a977-a9058b096998-operator-scripts\") pod \"d23756fe-5e0d-43f4-a977-a9058b096998\" (UID: \"d23756fe-5e0d-43f4-a977-a9058b096998\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098450 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-w569d\" (UniqueName: \"kubernetes.io/projected/690c5d16-8767-4215-adcc-6c52a3f214f9-kube-api-access-w569d\") pod \"690c5d16-8767-4215-adcc-6c52a3f214f9\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098573 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/690c5d16-8767-4215-adcc-6c52a3f214f9-operator-scripts\") pod \"690c5d16-8767-4215-adcc-6c52a3f214f9\" (UID: \"690c5d16-8767-4215-adcc-6c52a3f214f9\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.098612 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfqpc\" (UniqueName: \"kubernetes.io/projected/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-kube-api-access-pfqpc\") pod \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\" (UID: \"ab96b3c5-39bc-40ae-a1eb-2a751e90c944\") " Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.099111 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35185358-7b72-425e-af0b-c52b7887ce93-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.099137 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6x2m\" (UniqueName: \"kubernetes.io/projected/35185358-7b72-425e-af0b-c52b7887ce93-kube-api-access-q6x2m\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.099210 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab96b3c5-39bc-40ae-a1eb-2a751e90c944" (UID: "ab96b3c5-39bc-40ae-a1eb-2a751e90c944"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.099576 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4c3a5c-ab08-4b94-877e-73e2641429d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d4c3a5c-ab08-4b94-877e-73e2641429d4" (UID: "3d4c3a5c-ab08-4b94-877e-73e2641429d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.099773 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d23756fe-5e0d-43f4-a977-a9058b096998-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d23756fe-5e0d-43f4-a977-a9058b096998" (UID: "d23756fe-5e0d-43f4-a977-a9058b096998"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.100317 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690c5d16-8767-4215-adcc-6c52a3f214f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "690c5d16-8767-4215-adcc-6c52a3f214f9" (UID: "690c5d16-8767-4215-adcc-6c52a3f214f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.101638 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4c3a5c-ab08-4b94-877e-73e2641429d4-kube-api-access-r82jt" (OuterVolumeSpecName: "kube-api-access-r82jt") pod "3d4c3a5c-ab08-4b94-877e-73e2641429d4" (UID: "3d4c3a5c-ab08-4b94-877e-73e2641429d4"). InnerVolumeSpecName "kube-api-access-r82jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.104630 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23756fe-5e0d-43f4-a977-a9058b096998-kube-api-access-f9x7n" (OuterVolumeSpecName: "kube-api-access-f9x7n") pod "d23756fe-5e0d-43f4-a977-a9058b096998" (UID: "d23756fe-5e0d-43f4-a977-a9058b096998"). InnerVolumeSpecName "kube-api-access-f9x7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.104969 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690c5d16-8767-4215-adcc-6c52a3f214f9-kube-api-access-w569d" (OuterVolumeSpecName: "kube-api-access-w569d") pod "690c5d16-8767-4215-adcc-6c52a3f214f9" (UID: "690c5d16-8767-4215-adcc-6c52a3f214f9"). InnerVolumeSpecName "kube-api-access-w569d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.105060 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-kube-api-access-pfqpc" (OuterVolumeSpecName: "kube-api-access-pfqpc") pod "ab96b3c5-39bc-40ae-a1eb-2a751e90c944" (UID: "ab96b3c5-39bc-40ae-a1eb-2a751e90c944"). InnerVolumeSpecName "kube-api-access-pfqpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.200956 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/690c5d16-8767-4215-adcc-6c52a3f214f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201001 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfqpc\" (UniqueName: \"kubernetes.io/projected/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-kube-api-access-pfqpc\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201019 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r82jt\" (UniqueName: \"kubernetes.io/projected/3d4c3a5c-ab08-4b94-877e-73e2641429d4-kube-api-access-r82jt\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201031 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab96b3c5-39bc-40ae-a1eb-2a751e90c944-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201043 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4c3a5c-ab08-4b94-877e-73e2641429d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201056 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9x7n\" (UniqueName: \"kubernetes.io/projected/d23756fe-5e0d-43f4-a977-a9058b096998-kube-api-access-f9x7n\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201068 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d23756fe-5e0d-43f4-a977-a9058b096998-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 
09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.201079 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w569d\" (UniqueName: \"kubernetes.io/projected/690c5d16-8767-4215-adcc-6c52a3f214f9-kube-api-access-w569d\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.304822 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fa36-account-create-update-cczt6" event={"ID":"ab96b3c5-39bc-40ae-a1eb-2a751e90c944","Type":"ContainerDied","Data":"52a0ad1a4a672112d2e171619760dad71636ee44edab2b40d05fb2b92d747a57"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.304848 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fa36-account-create-update-cczt6" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.304864 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a0ad1a4a672112d2e171619760dad71636ee44edab2b40d05fb2b92d747a57" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.309160 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2fa6-account-create-update-cjr6n" event={"ID":"3d4c3a5c-ab08-4b94-877e-73e2641429d4","Type":"ContainerDied","Data":"9cc39bbef945b6514ae6eddb0ea176371f91fed7af65bf05b385f827d608b9b1"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.309204 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cc39bbef945b6514ae6eddb0ea176371f91fed7af65bf05b385f827d608b9b1" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.309195 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2fa6-account-create-update-cjr6n" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.312251 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gwvqv" event={"ID":"71b91424-3380-42f6-a2a3-edcb31b2eee2","Type":"ContainerDied","Data":"f8b849c3e0338d516f6b5d2adae42adb5435a516649c19236f07d7bfc0da7c16"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.312306 4992 scope.go:117] "RemoveContainer" containerID="4e0af4a41c8debfa41dd5f7c5e3f6445b561fd01ecb9c6be9daa74e4832aeb73" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.312269 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gwvqv" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.319456 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c015-account-create-update-lm7x4" event={"ID":"d23756fe-5e0d-43f4-a977-a9058b096998","Type":"ContainerDied","Data":"5ec8bb6e69e8f45ca99a4abf0fb1855715aab7db03e84f4e2d1a659897f29245"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.319493 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ec8bb6e69e8f45ca99a4abf0fb1855715aab7db03e84f4e2d1a659897f29245" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.319471 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c015-account-create-update-lm7x4" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.323048 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tfxcr" event={"ID":"35185358-7b72-425e-af0b-c52b7887ce93","Type":"ContainerDied","Data":"3edd31cc314112bdade4f8283c1c7d6875941ce00f0ba495893bb27399c7eaa0"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.323088 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3edd31cc314112bdade4f8283c1c7d6875941ce00f0ba495893bb27399c7eaa0" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.323093 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tfxcr" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.327974 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sb5bv" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.327978 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sb5bv" event={"ID":"f80b2ba9-24f2-44ec-a523-74a843ee40dd","Type":"ContainerDied","Data":"57342c4f623daf45e9d8a83f27111262026797c65928cc7e1cd9e942100eda5c"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.328040 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57342c4f623daf45e9d8a83f27111262026797c65928cc7e1cd9e942100eda5c" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.331594 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q7bsh" event={"ID":"690c5d16-8767-4215-adcc-6c52a3f214f9","Type":"ContainerDied","Data":"125de38f99c484fd0a9fb5c0e7ca12ad69f35efe7657da70800dda0781bd49f7"} Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.331633 4992 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="125de38f99c484fd0a9fb5c0e7ca12ad69f35efe7657da70800dda0781bd49f7" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.331660 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q7bsh" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.345910 4992 scope.go:117] "RemoveContainer" containerID="0eaa5f1b2fc1e5e9a5dd192d57deec6fa5a68e863110ed44bbf435c50ddaf5d0" Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.461110 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gwvqv"] Jan 31 09:44:12 crc kubenswrapper[4992]: I0131 09:44:12.468133 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gwvqv"] Jan 31 09:44:13 crc kubenswrapper[4992]: I0131 09:44:13.194778 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" path="/var/lib/kubelet/pods/71b91424-3380-42f6-a2a3-edcb31b2eee2/volumes" Jan 31 09:44:17 crc kubenswrapper[4992]: I0131 09:44:17.370302 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lw589" event={"ID":"9ac29a1f-926d-44c3-b380-4f48340ad9ce","Type":"ContainerStarted","Data":"9e44dbfa9195a0240df096afe4c92d21c3e0bcab9078bdb9ca7f5640b797ef34"} Jan 31 09:44:17 crc kubenswrapper[4992]: I0131 09:44:17.389549 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-lw589" podStartSLOduration=2.433491549 podStartE2EDuration="9.389531927s" podCreationTimestamp="2026-01-31 09:44:08 +0000 UTC" firstStartedPulling="2026-01-31 09:44:09.618087871 +0000 UTC m=+1145.589479858" lastFinishedPulling="2026-01-31 09:44:16.574128249 +0000 UTC m=+1152.545520236" observedRunningTime="2026-01-31 09:44:17.386791267 +0000 UTC m=+1153.358183274" watchObservedRunningTime="2026-01-31 09:44:17.389531927 +0000 UTC m=+1153.360923934" Jan 31 09:44:21 crc kubenswrapper[4992]: I0131 
09:44:21.416970 4992 generic.go:334] "Generic (PLEG): container finished" podID="9ac29a1f-926d-44c3-b380-4f48340ad9ce" containerID="9e44dbfa9195a0240df096afe4c92d21c3e0bcab9078bdb9ca7f5640b797ef34" exitCode=0 Jan 31 09:44:21 crc kubenswrapper[4992]: I0131 09:44:21.417073 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lw589" event={"ID":"9ac29a1f-926d-44c3-b380-4f48340ad9ce","Type":"ContainerDied","Data":"9e44dbfa9195a0240df096afe4c92d21c3e0bcab9078bdb9ca7f5640b797ef34"} Jan 31 09:44:22 crc kubenswrapper[4992]: I0131 09:44:22.807997 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:22 crc kubenswrapper[4992]: I0131 09:44:22.988190 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-combined-ca-bundle\") pod \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " Jan 31 09:44:22 crc kubenswrapper[4992]: I0131 09:44:22.988377 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xdbj\" (UniqueName: \"kubernetes.io/projected/9ac29a1f-926d-44c3-b380-4f48340ad9ce-kube-api-access-2xdbj\") pod \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " Jan 31 09:44:22 crc kubenswrapper[4992]: I0131 09:44:22.988605 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-config-data\") pod \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\" (UID: \"9ac29a1f-926d-44c3-b380-4f48340ad9ce\") " Jan 31 09:44:22 crc kubenswrapper[4992]: I0131 09:44:22.995890 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac29a1f-926d-44c3-b380-4f48340ad9ce-kube-api-access-2xdbj" 
(OuterVolumeSpecName: "kube-api-access-2xdbj") pod "9ac29a1f-926d-44c3-b380-4f48340ad9ce" (UID: "9ac29a1f-926d-44c3-b380-4f48340ad9ce"). InnerVolumeSpecName "kube-api-access-2xdbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.032241 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ac29a1f-926d-44c3-b380-4f48340ad9ce" (UID: "9ac29a1f-926d-44c3-b380-4f48340ad9ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.037764 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-config-data" (OuterVolumeSpecName: "config-data") pod "9ac29a1f-926d-44c3-b380-4f48340ad9ce" (UID: "9ac29a1f-926d-44c3-b380-4f48340ad9ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.090540 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xdbj\" (UniqueName: \"kubernetes.io/projected/9ac29a1f-926d-44c3-b380-4f48340ad9ce-kube-api-access-2xdbj\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.090574 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.090583 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ac29a1f-926d-44c3-b380-4f48340ad9ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.433673 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-lw589" event={"ID":"9ac29a1f-926d-44c3-b380-4f48340ad9ce","Type":"ContainerDied","Data":"6e62146682284c2f524ada5818bda8ac13ce5ee8ac4ada19644702254639f3c9"} Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.433902 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e62146682284c2f524ada5818bda8ac13ce5ee8ac4ada19644702254639f3c9" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.433762 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-lw589" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602300 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67795cd9-wbr5n"] Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602638 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35185358-7b72-425e-af0b-c52b7887ce93" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602657 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="35185358-7b72-425e-af0b-c52b7887ce93" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602669 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac29a1f-926d-44c3-b380-4f48340ad9ce" containerName="keystone-db-sync" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602675 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac29a1f-926d-44c3-b380-4f48340ad9ce" containerName="keystone-db-sync" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602689 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80b2ba9-24f2-44ec-a523-74a843ee40dd" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602695 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80b2ba9-24f2-44ec-a523-74a843ee40dd" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602706 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerName="init" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602711 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerName="init" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602721 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4c3a5c-ab08-4b94-877e-73e2641429d4" 
containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602727 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4c3a5c-ab08-4b94-877e-73e2641429d4" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602743 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab96b3c5-39bc-40ae-a1eb-2a751e90c944" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602748 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab96b3c5-39bc-40ae-a1eb-2a751e90c944" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602761 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23756fe-5e0d-43f4-a977-a9058b096998" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602766 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23756fe-5e0d-43f4-a977-a9058b096998" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602776 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690c5d16-8767-4215-adcc-6c52a3f214f9" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602782 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="690c5d16-8767-4215-adcc-6c52a3f214f9" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: E0131 09:44:23.602790 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerName="dnsmasq-dns" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602796 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerName="dnsmasq-dns" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602921 4992 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="690c5d16-8767-4215-adcc-6c52a3f214f9" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602931 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b91424-3380-42f6-a2a3-edcb31b2eee2" containerName="dnsmasq-dns" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602943 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23756fe-5e0d-43f4-a977-a9058b096998" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602951 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac29a1f-926d-44c3-b380-4f48340ad9ce" containerName="keystone-db-sync" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602961 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80b2ba9-24f2-44ec-a523-74a843ee40dd" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602972 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4c3a5c-ab08-4b94-877e-73e2641429d4" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602981 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="35185358-7b72-425e-af0b-c52b7887ce93" containerName="mariadb-database-create" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.602991 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab96b3c5-39bc-40ae-a1eb-2a751e90c944" containerName="mariadb-account-create-update" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.603815 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.634070 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-wbr5n"] Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.676492 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2tg6f"] Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.677780 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.679926 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.680134 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkf5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.682695 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.682859 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.682903 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.701547 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-combined-ca-bundle\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.701815 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-credential-keys\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.701933 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-config\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702018 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-scripts\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702087 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-config-data\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702171 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gn8m\" (UniqueName: \"kubernetes.io/projected/10a31e95-2163-4d81-acf4-7e6eb7318df3-kube-api-access-9gn8m\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702247 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702334 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-dns-svc\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702452 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-fernet-keys\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702532 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.702610 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkqtx\" (UniqueName: \"kubernetes.io/projected/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-kube-api-access-qkqtx\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.711242 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2tg6f"] Jan 31 09:44:23 crc 
kubenswrapper[4992]: I0131 09:44:23.803644 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-dns-svc\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803725 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-fernet-keys\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803744 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803766 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkqtx\" (UniqueName: \"kubernetes.io/projected/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-kube-api-access-qkqtx\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803809 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-combined-ca-bundle\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803835 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-credential-keys\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803856 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-config\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803874 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-scripts\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803891 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-config-data\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803915 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gn8m\" (UniqueName: \"kubernetes.io/projected/10a31e95-2163-4d81-acf4-7e6eb7318df3-kube-api-access-9gn8m\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.803936 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.805569 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-sb\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.805655 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-dns-svc\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.807165 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-config\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.807779 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-nb\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.818911 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-config-data\") pod \"keystone-bootstrap-2tg6f\" (UID: 
\"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.818911 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-fernet-keys\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.829801 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-combined-ca-bundle\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.834697 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-scripts\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.842112 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkqtx\" (UniqueName: \"kubernetes.io/projected/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-kube-api-access-qkqtx\") pod \"dnsmasq-dns-67795cd9-wbr5n\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.847018 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gn8m\" (UniqueName: \"kubernetes.io/projected/10a31e95-2163-4d81-acf4-7e6eb7318df3-kube-api-access-9gn8m\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc 
kubenswrapper[4992]: I0131 09:44:23.853092 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-credential-keys\") pod \"keystone-bootstrap-2tg6f\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.865108 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68c4b8b7d5-sdxvb"] Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.873172 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.879709 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-422vh" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.880004 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.880286 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.880011 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.904163 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68c4b8b7d5-sdxvb"] Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.904845 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-config-data\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.904993 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9ds\" (UniqueName: \"kubernetes.io/projected/df170e16-84e4-4d9f-b289-11cabc2da983-kube-api-access-9p9ds\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.905083 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df170e16-84e4-4d9f-b289-11cabc2da983-logs\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.905178 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df170e16-84e4-4d9f-b289-11cabc2da983-horizon-secret-key\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.905274 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-scripts\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.936533 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:23 crc kubenswrapper[4992]: I0131 09:44:23.997940 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.009022 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9ds\" (UniqueName: \"kubernetes.io/projected/df170e16-84e4-4d9f-b289-11cabc2da983-kube-api-access-9p9ds\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.009063 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df170e16-84e4-4d9f-b289-11cabc2da983-logs\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.009097 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df170e16-84e4-4d9f-b289-11cabc2da983-horizon-secret-key\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.009122 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-scripts\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.009190 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-config-data\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 
09:44:24.010583 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-config-data\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.010854 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9dxlh"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.010980 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-scripts\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.011091 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df170e16-84e4-4d9f-b289-11cabc2da983-logs\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.011733 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.018065 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-h4h6m" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.036954 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.037587 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df170e16-84e4-4d9f-b289-11cabc2da983-horizon-secret-key\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.042272 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.042478 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4mh4k"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.043497 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.057282 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9dxlh"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.065165 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.068267 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9v55r" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.068707 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.097142 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4mh4k"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.102993 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9ds\" (UniqueName: \"kubernetes.io/projected/df170e16-84e4-4d9f-b289-11cabc2da983-kube-api-access-9p9ds\") pod \"horizon-68c4b8b7d5-sdxvb\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120329 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f531e7-e05e-4537-bb22-01911330abd2-etc-machine-id\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120380 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-db-sync-config-data\") pod \"cinder-db-sync-9dxlh\" (UID: 
\"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120427 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9n9m\" (UniqueName: \"kubernetes.io/projected/cf4a75ee-0abf-46e9-ac05-14641a2fd782-kube-api-access-b9n9m\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120444 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-config\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120501 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-scripts\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120533 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-combined-ca-bundle\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120570 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-combined-ca-bundle\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " 
pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120637 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-config-data\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.120651 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwkv\" (UniqueName: \"kubernetes.io/projected/57f531e7-e05e-4537-bb22-01911330abd2-kube-api-access-bhwkv\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.218046 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55c95bb65-ttg8k"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.219440 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.222802 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.222963 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e974b81f-d9b4-4209-b9a6-72b22eec93e4-horizon-secret-key\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223013 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e974b81f-d9b4-4209-b9a6-72b22eec93e4-logs\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223032 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-db-sync-config-data\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223053 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9n9m\" (UniqueName: \"kubernetes.io/projected/cf4a75ee-0abf-46e9-ac05-14641a2fd782-kube-api-access-b9n9m\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223077 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-config\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc 
kubenswrapper[4992]: I0131 09:44:24.223097 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-scripts\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223113 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl5wh\" (UniqueName: \"kubernetes.io/projected/e974b81f-d9b4-4209-b9a6-72b22eec93e4-kube-api-access-rl5wh\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223149 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-scripts\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223172 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-combined-ca-bundle\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223199 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-combined-ca-bundle\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223244 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-config-data\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223258 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwkv\" (UniqueName: \"kubernetes.io/projected/57f531e7-e05e-4537-bb22-01911330abd2-kube-api-access-bhwkv\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223292 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-config-data\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223312 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f531e7-e05e-4537-bb22-01911330abd2-etc-machine-id\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.223372 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f531e7-e05e-4537-bb22-01911330abd2-etc-machine-id\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.248146 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-scripts\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.253946 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-db-sync-config-data\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.254947 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-combined-ca-bundle\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.255211 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-config-data\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.262202 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-config\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.265999 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-combined-ca-bundle\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " 
pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.267411 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lbgq7"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.268387 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.275131 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.275297 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-brjft" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.281701 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9n9m\" (UniqueName: \"kubernetes.io/projected/cf4a75ee-0abf-46e9-ac05-14641a2fd782-kube-api-access-b9n9m\") pod \"neutron-db-sync-4mh4k\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.295986 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwkv\" (UniqueName: \"kubernetes.io/projected/57f531e7-e05e-4537-bb22-01911330abd2-kube-api-access-bhwkv\") pod \"cinder-db-sync-9dxlh\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.315499 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lbgq7"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327058 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-config-data\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " 
pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327111 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e974b81f-d9b4-4209-b9a6-72b22eec93e4-horizon-secret-key\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327154 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e974b81f-d9b4-4209-b9a6-72b22eec93e4-logs\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327216 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-scripts\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327237 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl5wh\" (UniqueName: \"kubernetes.io/projected/e974b81f-d9b4-4209-b9a6-72b22eec93e4-kube-api-access-rl5wh\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327665 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e974b81f-d9b4-4209-b9a6-72b22eec93e4-logs\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.327929 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-scripts\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.329475 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-config-data\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.335708 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-wbr5n"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.340522 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e974b81f-d9b4-4209-b9a6-72b22eec93e4-horizon-secret-key\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.374373 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c95bb65-ttg8k"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.395148 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl5wh\" (UniqueName: \"kubernetes.io/projected/e974b81f-d9b4-4209-b9a6-72b22eec93e4-kube-api-access-rl5wh\") pod \"horizon-55c95bb65-ttg8k\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.409067 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mbgzl"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.410155 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.413943 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p8h77" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.414101 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.414155 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.428314 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-db-sync-config-data\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.428382 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-combined-ca-bundle\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.428457 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwwzk\" (UniqueName: \"kubernetes.io/projected/02f6c85a-822e-4864-b0aa-1c487d73721c-kube-api-access-bwwzk\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.440254 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.446364 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mbgzl"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.455983 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-n2kll"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.459836 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.493964 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.530883 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-db-sync-config-data\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.531615 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-scripts\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.531759 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-combined-ca-bundle\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.531949 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-combined-ca-bundle\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.532157 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-config-data\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.532311 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-logs\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.532470 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwwzk\" (UniqueName: \"kubernetes.io/projected/02f6c85a-822e-4864-b0aa-1c487d73721c-kube-api-access-bwwzk\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.532618 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s9mg\" (UniqueName: \"kubernetes.io/projected/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-kube-api-access-4s9mg\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.534724 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5b6dbdb6f5-n2kll"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.536445 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-combined-ca-bundle\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.551168 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-db-sync-config-data\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.551329 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.556428 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.563120 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.569883 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.576141 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.579614 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwwzk\" (UniqueName: \"kubernetes.io/projected/02f6c85a-822e-4864-b0aa-1c487d73721c-kube-api-access-bwwzk\") pod \"barbican-db-sync-lbgq7\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.580703 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.640848 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-config\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.640929 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skw88\" (UniqueName: \"kubernetes.io/projected/c0b24a72-cd6e-425a-ac8f-f810990eb8df-kube-api-access-skw88\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.640988 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641032 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-scripts\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641119 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " 
pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641202 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-combined-ca-bundle\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641281 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-config-data\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641334 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-logs\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641369 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9mg\" (UniqueName: \"kubernetes.io/projected/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-kube-api-access-4s9mg\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.641432 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: 
I0131 09:44:24.642161 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-logs\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.642864 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.646130 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-config-data\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.652164 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-combined-ca-bundle\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.655523 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-scripts\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.674530 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9mg\" (UniqueName: \"kubernetes.io/projected/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-kube-api-access-4s9mg\") pod \"placement-db-sync-mbgzl\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc 
kubenswrapper[4992]: I0131 09:44:24.743305 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skw88\" (UniqueName: \"kubernetes.io/projected/c0b24a72-cd6e-425a-ac8f-f810990eb8df-kube-api-access-skw88\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.743556 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-run-httpd\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.743641 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drjzg\" (UniqueName: \"kubernetes.io/projected/014742b0-d197-4f2d-9186-3fb1daa4318e-kube-api-access-drjzg\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.743714 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.743803 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.743903 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-log-httpd\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.744107 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.744230 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-config-data\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.744314 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-scripts\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.744378 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.744521 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.744611 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-config\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.745517 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-config\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.745747 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.745750 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.746633 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: 
\"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.758823 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mbgzl" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.767372 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skw88\" (UniqueName: \"kubernetes.io/projected/c0b24a72-cd6e-425a-ac8f-f810990eb8df-kube-api-access-skw88\") pod \"dnsmasq-dns-5b6dbdb6f5-n2kll\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.807181 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.810597 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-wbr5n"] Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849451 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-log-httpd\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849501 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-config-data\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849532 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-scripts\") pod \"ceilometer-0\" (UID: 
\"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849549 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849577 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849610 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-run-httpd\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.849633 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drjzg\" (UniqueName: \"kubernetes.io/projected/014742b0-d197-4f2d-9186-3fb1daa4318e-kube-api-access-drjzg\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.850301 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-log-httpd\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.852467 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-run-httpd\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.858792 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-config-data\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.859776 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.859952 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.861582 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-scripts\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.877057 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drjzg\" (UniqueName: \"kubernetes.io/projected/014742b0-d197-4f2d-9186-3fb1daa4318e-kube-api-access-drjzg\") pod \"ceilometer-0\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 
09:44:24.903434 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:44:24 crc kubenswrapper[4992]: I0131 09:44:24.910240 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2tg6f"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.014076 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68c4b8b7d5-sdxvb"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.127633 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4mh4k"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.138746 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9dxlh"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.308059 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55c95bb65-ttg8k"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.422492 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mbgzl"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.436328 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lbgq7"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.452120 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c95bb65-ttg8k" event={"ID":"e974b81f-d9b4-4209-b9a6-72b22eec93e4","Type":"ContainerStarted","Data":"5e3d3e027863c24a76c1b9d383c7288b771f3096e665c06c91d334696d2d0ff9"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.453193 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mbgzl" event={"ID":"db1e7454-fec7-4ec7-a2e2-5e4ebb145213","Type":"ContainerStarted","Data":"368e6cc17c9da448460aa7beb099f3d3acf28c3ebba95fcb99501df2b70a0b7c"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.455404 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" containerID="ba32ca1b0aefd9d0e35c548826b6ca44eded8f9c090b0c63ef222b6b60f06d18" exitCode=0 Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.455484 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" event={"ID":"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17","Type":"ContainerDied","Data":"ba32ca1b0aefd9d0e35c548826b6ca44eded8f9c090b0c63ef222b6b60f06d18"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.455508 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" event={"ID":"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17","Type":"ContainerStarted","Data":"c25c5276dda398add99fb66c3acefd888ce50dd162ad0130a77252287f1fb282"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.457699 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c4b8b7d5-sdxvb" event={"ID":"df170e16-84e4-4d9f-b289-11cabc2da983","Type":"ContainerStarted","Data":"cf5368bb00170a9a99b17934e67f70ced3326db5295965e7d0fc4d5c9a77d04a"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.463658 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9dxlh" event={"ID":"57f531e7-e05e-4537-bb22-01911330abd2","Type":"ContainerStarted","Data":"637a267f6f098ba2530c0424a5d8fbb123eb5ed1503b83373f6efad7fe174ca9"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.465044 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbgq7" event={"ID":"02f6c85a-822e-4864-b0aa-1c487d73721c","Type":"ContainerStarted","Data":"9b0e8b374ac3cc33c2c554d05fcc1e77d073585e970502c09a8ad9baff93c314"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.466582 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tg6f" event={"ID":"10a31e95-2163-4d81-acf4-7e6eb7318df3","Type":"ContainerStarted","Data":"064ddcb62dd845716362c318734fec5d6c84d2b568a60bdf0ea5cb9f3c22b2f4"} Jan 
31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.466604 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tg6f" event={"ID":"10a31e95-2163-4d81-acf4-7e6eb7318df3","Type":"ContainerStarted","Data":"7f621ae36cd590f74e793df65598f64504c440c387e332605a0303f7869b8a00"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.471683 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4mh4k" event={"ID":"cf4a75ee-0abf-46e9-ac05-14641a2fd782","Type":"ContainerStarted","Data":"69e839170a296d37a2791c1c4b3a5531525a1d55c14766d4f8f8785969eeef4f"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.471721 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4mh4k" event={"ID":"cf4a75ee-0abf-46e9-ac05-14641a2fd782","Type":"ContainerStarted","Data":"7aea70eaaffcd5192a2452ea9d2866fc67080c8de7711aa0ef050d640a3c3cdf"} Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.504323 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4mh4k" podStartSLOduration=2.504304945 podStartE2EDuration="2.504304945s" podCreationTimestamp="2026-01-31 09:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:25.496771028 +0000 UTC m=+1161.468163035" watchObservedRunningTime="2026-01-31 09:44:25.504304945 +0000 UTC m=+1161.475696932" Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.528375 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2tg6f" podStartSLOduration=2.528351358 podStartE2EDuration="2.528351358s" podCreationTimestamp="2026-01-31 09:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:25.525130265 +0000 UTC m=+1161.496522272" 
watchObservedRunningTime="2026-01-31 09:44:25.528351358 +0000 UTC m=+1161.499743345" Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.601705 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-n2kll"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.608328 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.810694 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68c4b8b7d5-sdxvb"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.938265 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-96867c577-xw6dh"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.941525 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.946192 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.960828 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-96867c577-xw6dh"] Jan 31 09:44:25 crc kubenswrapper[4992]: I0131 09:44:25.972631 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.105359 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-dns-svc\") pod \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.105452 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkqtx\" (UniqueName: \"kubernetes.io/projected/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-kube-api-access-qkqtx\") pod \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.105482 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-config\") pod \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.105612 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-sb\") pod \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.105649 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-nb\") pod \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\" (UID: \"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17\") " Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.106383 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-config-data\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.110206 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-scripts\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.110251 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-logs\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.110497 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sbxc\" (UniqueName: \"kubernetes.io/projected/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-kube-api-access-4sbxc\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.110563 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-horizon-secret-key\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.112069 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-kube-api-access-qkqtx" (OuterVolumeSpecName: "kube-api-access-qkqtx") pod "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" (UID: "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17"). InnerVolumeSpecName "kube-api-access-qkqtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.140088 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-config" (OuterVolumeSpecName: "config") pod "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" (UID: "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.143130 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" (UID: "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.146217 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" (UID: "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.149924 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" (UID: "aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212341 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-horizon-secret-key\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212494 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-config-data\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212537 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-scripts\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212569 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-logs\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc 
kubenswrapper[4992]: I0131 09:44:26.212627 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sbxc\" (UniqueName: \"kubernetes.io/projected/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-kube-api-access-4sbxc\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212674 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212684 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkqtx\" (UniqueName: \"kubernetes.io/projected/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-kube-api-access-qkqtx\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212693 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212702 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.212709 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.214749 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-logs\") pod \"horizon-96867c577-xw6dh\" (UID: 
\"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.215491 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-config-data\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.215554 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-scripts\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.216195 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-horizon-secret-key\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.229177 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sbxc\" (UniqueName: \"kubernetes.io/projected/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-kube-api-access-4sbxc\") pod \"horizon-96867c577-xw6dh\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.270885 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.511606 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerStarted","Data":"529a2886592501b3de60efe5bb2a15a6d2b1c9a94f5e1d29c9f56686685559c6"} Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.529622 4992 generic.go:334] "Generic (PLEG): container finished" podID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerID="83a051c833ae7061f437c780e99f141c06381328067fbd1215387053df7ec005" exitCode=0 Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.529699 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" event={"ID":"c0b24a72-cd6e-425a-ac8f-f810990eb8df","Type":"ContainerDied","Data":"83a051c833ae7061f437c780e99f141c06381328067fbd1215387053df7ec005"} Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.529724 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" event={"ID":"c0b24a72-cd6e-425a-ac8f-f810990eb8df","Type":"ContainerStarted","Data":"cf65f5e840bc599af4190cfecba0374a819c44a97fc66163cbd9b19652397b55"} Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.536540 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.537525 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67795cd9-wbr5n" event={"ID":"aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17","Type":"ContainerDied","Data":"c25c5276dda398add99fb66c3acefd888ce50dd162ad0130a77252287f1fb282"} Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.537588 4992 scope.go:117] "RemoveContainer" containerID="ba32ca1b0aefd9d0e35c548826b6ca44eded8f9c090b0c63ef222b6b60f06d18" Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.639002 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-wbr5n"] Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.649359 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67795cd9-wbr5n"] Jan 31 09:44:26 crc kubenswrapper[4992]: I0131 09:44:26.862734 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-96867c577-xw6dh"] Jan 31 09:44:27 crc kubenswrapper[4992]: I0131 09:44:27.198056 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" path="/var/lib/kubelet/pods/aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17/volumes" Jan 31 09:44:27 crc kubenswrapper[4992]: I0131 09:44:27.547840 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96867c577-xw6dh" event={"ID":"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca","Type":"ContainerStarted","Data":"586b8b7201f643cb621a0b6bc1d70e990a8b182a51dca43a35345c3ac66281a7"} Jan 31 09:44:27 crc kubenswrapper[4992]: I0131 09:44:27.551261 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" event={"ID":"c0b24a72-cd6e-425a-ac8f-f810990eb8df","Type":"ContainerStarted","Data":"c1bf1f06d94c1a57ef50cf1a6f46f96d135d9ed79f0f3b54ba6260b166312c71"} Jan 31 09:44:27 crc kubenswrapper[4992]: I0131 09:44:27.551717 4992 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:27 crc kubenswrapper[4992]: I0131 09:44:27.572397 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" podStartSLOduration=3.5723607 podStartE2EDuration="3.5723607s" podCreationTimestamp="2026-01-31 09:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:27.568763196 +0000 UTC m=+1163.540155203" watchObservedRunningTime="2026-01-31 09:44:27.5723607 +0000 UTC m=+1163.543752687" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.444895 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c95bb65-ttg8k"] Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.492764 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65f6bf6f54-x2b8z"] Jan 31 09:44:32 crc kubenswrapper[4992]: E0131 09:44:32.493203 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" containerName="init" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.493227 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" containerName="init" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.493813 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5b05c9-1e92-4c5a-8d41-3f7ec8f2fd17" containerName="init" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.494848 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.498352 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.520564 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65f6bf6f54-x2b8z"] Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.555258 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ff2a8b-a743-475e-9ae5-5fb98839ba57-logs\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.555318 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmlxg\" (UniqueName: \"kubernetes.io/projected/04ff2a8b-a743-475e-9ae5-5fb98839ba57-kube-api-access-gmlxg\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.555375 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-config-data\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.555397 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-scripts\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc 
kubenswrapper[4992]: I0131 09:44:32.555487 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-tls-certs\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.556406 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-secret-key\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.556480 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-combined-ca-bundle\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.567829 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-96867c577-xw6dh"] Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.624952 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d6fc5dc84-n2kln"] Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.626243 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.666551 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ff2a8b-a743-475e-9ae5-5fb98839ba57-logs\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.666876 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmlxg\" (UniqueName: \"kubernetes.io/projected/04ff2a8b-a743-475e-9ae5-5fb98839ba57-kube-api-access-gmlxg\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.667054 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-config-data\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.667171 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-scripts\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.667437 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-tls-certs\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 
09:44:32.672270 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-config-data\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.667467 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ff2a8b-a743-475e-9ae5-5fb98839ba57-logs\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.672950 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-secret-key\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.674830 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-combined-ca-bundle\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.676673 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-scripts\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.679040 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d6fc5dc84-n2kln"] Jan 31 09:44:32 crc 
kubenswrapper[4992]: I0131 09:44:32.681787 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-combined-ca-bundle\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.692163 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-secret-key\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.694763 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmlxg\" (UniqueName: \"kubernetes.io/projected/04ff2a8b-a743-475e-9ae5-5fb98839ba57-kube-api-access-gmlxg\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.707760 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-tls-certs\") pod \"horizon-65f6bf6f54-x2b8z\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777572 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6llwb\" (UniqueName: \"kubernetes.io/projected/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-kube-api-access-6llwb\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777627 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-combined-ca-bundle\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777651 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-horizon-secret-key\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777703 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-horizon-tls-certs\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777749 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-logs\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777783 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-config-data\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.777804 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-scripts\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.828312 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.879578 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6llwb\" (UniqueName: \"kubernetes.io/projected/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-kube-api-access-6llwb\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.879852 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-combined-ca-bundle\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.879886 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-horizon-secret-key\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.879940 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-horizon-tls-certs\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: 
\"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.879977 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-logs\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.880004 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-config-data\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.880022 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-scripts\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.880929 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-scripts\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.881700 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-logs\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.882604 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-config-data\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.887556 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-combined-ca-bundle\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.898267 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-horizon-secret-key\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.898901 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-horizon-tls-certs\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.899456 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6llwb\" (UniqueName: \"kubernetes.io/projected/7ec5b46d-f009-46c5-a8a5-78b5b3afc50e-kube-api-access-6llwb\") pod \"horizon-5d6fc5dc84-n2kln\" (UID: \"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e\") " pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:32 crc kubenswrapper[4992]: I0131 09:44:32.956065 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:44:33 crc kubenswrapper[4992]: I0131 09:44:33.634430 4992 generic.go:334] "Generic (PLEG): container finished" podID="10a31e95-2163-4d81-acf4-7e6eb7318df3" containerID="064ddcb62dd845716362c318734fec5d6c84d2b568a60bdf0ea5cb9f3c22b2f4" exitCode=0 Jan 31 09:44:33 crc kubenswrapper[4992]: I0131 09:44:33.634648 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tg6f" event={"ID":"10a31e95-2163-4d81-acf4-7e6eb7318df3","Type":"ContainerDied","Data":"064ddcb62dd845716362c318734fec5d6c84d2b568a60bdf0ea5cb9f3c22b2f4"} Jan 31 09:44:34 crc kubenswrapper[4992]: I0131 09:44:34.808606 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:44:34 crc kubenswrapper[4992]: I0131 09:44:34.882527 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-9nv8v"] Jan 31 09:44:34 crc kubenswrapper[4992]: I0131 09:44:34.882757 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" containerID="cri-o://8cda2b321cd22de41db6883e58f0e8a1afc3e092dd796236a9f3c64668bdea5f" gracePeriod=10 Jan 31 09:44:35 crc kubenswrapper[4992]: I0131 09:44:35.656604 4992 generic.go:334] "Generic (PLEG): container finished" podID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerID="8cda2b321cd22de41db6883e58f0e8a1afc3e092dd796236a9f3c64668bdea5f" exitCode=0 Jan 31 09:44:35 crc kubenswrapper[4992]: I0131 09:44:35.656647 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" event={"ID":"25e5d88a-2fe1-45fe-a262-bb5ef7742563","Type":"ContainerDied","Data":"8cda2b321cd22de41db6883e58f0e8a1afc3e092dd796236a9f3c64668bdea5f"} Jan 31 09:44:35 crc kubenswrapper[4992]: I0131 09:44:35.920539 4992 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Jan 31 09:44:39 crc kubenswrapper[4992]: E0131 09:44:39.256066 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 31 09:44:39 crc kubenswrapper[4992]: E0131 09:44:39.256603 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf5h57bhdch687hcch75hbfh75h5bdh589hffh587h649hd8h5fch7hcfh5ddhbdh99h8ch5d5h5f8h5dhb7h54h655h7bh545h54bh5ch64q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rl5wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:
IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-55c95bb65-ttg8k_openstack(e974b81f-d9b4-4209-b9a6-72b22eec93e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:44:39 crc kubenswrapper[4992]: E0131 09:44:39.699030 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-55c95bb65-ttg8k" podUID="e974b81f-d9b4-4209-b9a6-72b22eec93e4" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.787002 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.812127 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-credential-keys\") pod \"10a31e95-2163-4d81-acf4-7e6eb7318df3\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.812472 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-fernet-keys\") pod \"10a31e95-2163-4d81-acf4-7e6eb7318df3\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.812501 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-scripts\") pod \"10a31e95-2163-4d81-acf4-7e6eb7318df3\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.812644 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gn8m\" (UniqueName: \"kubernetes.io/projected/10a31e95-2163-4d81-acf4-7e6eb7318df3-kube-api-access-9gn8m\") pod \"10a31e95-2163-4d81-acf4-7e6eb7318df3\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.812741 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-config-data\") pod \"10a31e95-2163-4d81-acf4-7e6eb7318df3\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.812790 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-combined-ca-bundle\") pod \"10a31e95-2163-4d81-acf4-7e6eb7318df3\" (UID: \"10a31e95-2163-4d81-acf4-7e6eb7318df3\") " Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.817196 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-scripts" (OuterVolumeSpecName: "scripts") pod "10a31e95-2163-4d81-acf4-7e6eb7318df3" (UID: "10a31e95-2163-4d81-acf4-7e6eb7318df3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.819357 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "10a31e95-2163-4d81-acf4-7e6eb7318df3" (UID: "10a31e95-2163-4d81-acf4-7e6eb7318df3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.819567 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "10a31e95-2163-4d81-acf4-7e6eb7318df3" (UID: "10a31e95-2163-4d81-acf4-7e6eb7318df3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.820563 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a31e95-2163-4d81-acf4-7e6eb7318df3-kube-api-access-9gn8m" (OuterVolumeSpecName: "kube-api-access-9gn8m") pod "10a31e95-2163-4d81-acf4-7e6eb7318df3" (UID: "10a31e95-2163-4d81-acf4-7e6eb7318df3"). InnerVolumeSpecName "kube-api-access-9gn8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.843998 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-config-data" (OuterVolumeSpecName: "config-data") pod "10a31e95-2163-4d81-acf4-7e6eb7318df3" (UID: "10a31e95-2163-4d81-acf4-7e6eb7318df3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.850120 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10a31e95-2163-4d81-acf4-7e6eb7318df3" (UID: "10a31e95-2163-4d81-acf4-7e6eb7318df3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.915027 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.915067 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.915077 4992 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.915086 4992 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 
09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.915095 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a31e95-2163-4d81-acf4-7e6eb7318df3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:39 crc kubenswrapper[4992]: I0131 09:44:39.915102 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gn8m\" (UniqueName: \"kubernetes.io/projected/10a31e95-2163-4d81-acf4-7e6eb7318df3-kube-api-access-9gn8m\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.700541 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2tg6f" event={"ID":"10a31e95-2163-4d81-acf4-7e6eb7318df3","Type":"ContainerDied","Data":"7f621ae36cd590f74e793df65598f64504c440c387e332605a0303f7869b8a00"} Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.700598 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f621ae36cd590f74e793df65598f64504c440c387e332605a0303f7869b8a00" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.700568 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2tg6f" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.876764 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2tg6f"] Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.883136 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2tg6f"] Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.920185 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.957958 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-52w86"] Jan 31 09:44:40 crc kubenswrapper[4992]: E0131 09:44:40.958328 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a31e95-2163-4d81-acf4-7e6eb7318df3" containerName="keystone-bootstrap" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.958345 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a31e95-2163-4d81-acf4-7e6eb7318df3" containerName="keystone-bootstrap" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.958493 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a31e95-2163-4d81-acf4-7e6eb7318df3" containerName="keystone-bootstrap" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.959250 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.968844 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkf5n" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.968965 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.968964 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.969304 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.969500 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:44:40 crc kubenswrapper[4992]: I0131 09:44:40.975971 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-52w86"] Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.141366 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-combined-ca-bundle\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.141789 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-fernet-keys\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.141893 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-scripts\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.141987 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-config-data\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.142159 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grlnd\" (UniqueName: \"kubernetes.io/projected/cf1be806-c02e-4606-94ea-438caf8ef9c6-kube-api-access-grlnd\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.142310 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-credential-keys\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.194838 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a31e95-2163-4d81-acf4-7e6eb7318df3" path="/var/lib/kubelet/pods/10a31e95-2163-4d81-acf4-7e6eb7318df3/volumes" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.243550 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-fernet-keys\") pod \"keystone-bootstrap-52w86\" (UID: 
\"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.244088 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-scripts\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.244120 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-config-data\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.244219 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grlnd\" (UniqueName: \"kubernetes.io/projected/cf1be806-c02e-4606-94ea-438caf8ef9c6-kube-api-access-grlnd\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.244254 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-credential-keys\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.244300 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-combined-ca-bundle\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 
crc kubenswrapper[4992]: I0131 09:44:41.250070 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-credential-keys\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.250494 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-config-data\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.251088 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-scripts\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.251690 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-combined-ca-bundle\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.251728 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-fernet-keys\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.263020 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grlnd\" (UniqueName: 
\"kubernetes.io/projected/cf1be806-c02e-4606-94ea-438caf8ef9c6-kube-api-access-grlnd\") pod \"keystone-bootstrap-52w86\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: I0131 09:44:41.281692 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-52w86" Jan 31 09:44:41 crc kubenswrapper[4992]: E0131 09:44:41.414949 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 31 09:44:41 crc kubenswrapper[4992]: E0131 09:44:41.415597 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67h5bchf6h64ch56ch65fhf9h9h5b9h594hf9h58fhf8h8bh5cdh56h97hdchbh6h677hcch559h67chfchf8h5fdhf9h66fh555h568h9cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9p9ds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagatio
n:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-68c4b8b7d5-sdxvb_openstack(df170e16-84e4-4d9f-b289-11cabc2da983): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:44:41 crc kubenswrapper[4992]: E0131 09:44:41.417702 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-68c4b8b7d5-sdxvb" podUID="df170e16-84e4-4d9f-b289-11cabc2da983" Jan 31 09:44:43 crc kubenswrapper[4992]: E0131 09:44:43.814460 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Jan 31 09:44:43 crc kubenswrapper[4992]: E0131 09:44:43.815151 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4s9mg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-mbgzl_openstack(db1e7454-fec7-4ec7-a2e2-5e4ebb145213): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:44:43 crc kubenswrapper[4992]: E0131 09:44:43.816550 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-mbgzl" podUID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" Jan 31 09:44:44 crc kubenswrapper[4992]: E0131 09:44:44.734734 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-mbgzl" podUID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" Jan 31 09:44:45 crc kubenswrapper[4992]: I0131 09:44:45.300847 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:44:45 crc kubenswrapper[4992]: I0131 09:44:45.300915 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:44:45 crc kubenswrapper[4992]: I0131 09:44:45.919951 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection 
refused" Jan 31 09:44:45 crc kubenswrapper[4992]: I0131 09:44:45.920305 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:47 crc kubenswrapper[4992]: I0131 09:44:47.756800 4992 generic.go:334] "Generic (PLEG): container finished" podID="cf4a75ee-0abf-46e9-ac05-14641a2fd782" containerID="69e839170a296d37a2791c1c4b3a5531525a1d55c14766d4f8f8785969eeef4f" exitCode=0 Jan 31 09:44:47 crc kubenswrapper[4992]: I0131 09:44:47.756974 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4mh4k" event={"ID":"cf4a75ee-0abf-46e9-ac05-14641a2fd782","Type":"ContainerDied","Data":"69e839170a296d37a2791c1c4b3a5531525a1d55c14766d4f8f8785969eeef4f"} Jan 31 09:44:50 crc kubenswrapper[4992]: I0131 09:44:50.920062 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.192708 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.193231 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch586h6bh64fh96h55h689h7bh595h58h55bh666h59dh564h9fh58bh5b7h67bh68h66ch646h645h7ch5c9hb9h577h664h587h645h556hfch56cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4sbxc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-96867c577-xw6dh_openstack(abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 
09:44:52.254306 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.287579 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-96867c577-xw6dh" podUID="abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.449699 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-config-data\") pod \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450004 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-scripts\") pod \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450060 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e974b81f-d9b4-4209-b9a6-72b22eec93e4-logs\") pod \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450109 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e974b81f-d9b4-4209-b9a6-72b22eec93e4-horizon-secret-key\") pod \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\" (UID: 
\"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450212 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl5wh\" (UniqueName: \"kubernetes.io/projected/e974b81f-d9b4-4209-b9a6-72b22eec93e4-kube-api-access-rl5wh\") pod \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\" (UID: \"e974b81f-d9b4-4209-b9a6-72b22eec93e4\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450385 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e974b81f-d9b4-4209-b9a6-72b22eec93e4-logs" (OuterVolumeSpecName: "logs") pod "e974b81f-d9b4-4209-b9a6-72b22eec93e4" (UID: "e974b81f-d9b4-4209-b9a6-72b22eec93e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450696 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-config-data" (OuterVolumeSpecName: "config-data") pod "e974b81f-d9b4-4209-b9a6-72b22eec93e4" (UID: "e974b81f-d9b4-4209-b9a6-72b22eec93e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450980 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.450999 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e974b81f-d9b4-4209-b9a6-72b22eec93e4-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.451788 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-scripts" (OuterVolumeSpecName: "scripts") pod "e974b81f-d9b4-4209-b9a6-72b22eec93e4" (UID: "e974b81f-d9b4-4209-b9a6-72b22eec93e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.455250 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e974b81f-d9b4-4209-b9a6-72b22eec93e4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e974b81f-d9b4-4209-b9a6-72b22eec93e4" (UID: "e974b81f-d9b4-4209-b9a6-72b22eec93e4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.455995 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e974b81f-d9b4-4209-b9a6-72b22eec93e4-kube-api-access-rl5wh" (OuterVolumeSpecName: "kube-api-access-rl5wh") pod "e974b81f-d9b4-4209-b9a6-72b22eec93e4" (UID: "e974b81f-d9b4-4209-b9a6-72b22eec93e4"). InnerVolumeSpecName "kube-api-access-rl5wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.552549 4992 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e974b81f-d9b4-4209-b9a6-72b22eec93e4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.552582 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl5wh\" (UniqueName: \"kubernetes.io/projected/e974b81f-d9b4-4209-b9a6-72b22eec93e4-kube-api-access-rl5wh\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.552593 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e974b81f-d9b4-4209-b9a6-72b22eec93e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.777914 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.778096 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwwzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-lbgq7_openstack(02f6c85a-822e-4864-b0aa-1c487d73721c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.779633 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-lbgq7" 
podUID="02f6c85a-822e-4864-b0aa-1c487d73721c" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.798203 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55c95bb65-ttg8k" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.798202 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55c95bb65-ttg8k" event={"ID":"e974b81f-d9b4-4209-b9a6-72b22eec93e4","Type":"ContainerDied","Data":"5e3d3e027863c24a76c1b9d383c7288b771f3096e665c06c91d334696d2d0ff9"} Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.800455 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68c4b8b7d5-sdxvb" event={"ID":"df170e16-84e4-4d9f-b289-11cabc2da983","Type":"ContainerDied","Data":"cf5368bb00170a9a99b17934e67f70ced3326db5295965e7d0fc4d5c9a77d04a"} Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.800498 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5368bb00170a9a99b17934e67f70ced3326db5295965e7d0fc4d5c9a77d04a" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.802990 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4mh4k" event={"ID":"cf4a75ee-0abf-46e9-ac05-14641a2fd782","Type":"ContainerDied","Data":"7aea70eaaffcd5192a2452ea9d2866fc67080c8de7711aa0ef050d640a3c3cdf"} Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.803022 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aea70eaaffcd5192a2452ea9d2866fc67080c8de7711aa0ef050d640a3c3cdf" Jan 31 09:44:52 crc kubenswrapper[4992]: E0131 09:44:52.804803 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-lbgq7" podUID="02f6c85a-822e-4864-b0aa-1c487d73721c" Jan 31 09:44:52 
crc kubenswrapper[4992]: I0131 09:44:52.854520 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.861615 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.894543 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-55c95bb65-ttg8k"] Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.901897 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-55c95bb65-ttg8k"] Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.959643 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df170e16-84e4-4d9f-b289-11cabc2da983-horizon-secret-key\") pod \"df170e16-84e4-4d9f-b289-11cabc2da983\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.959730 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-scripts\") pod \"df170e16-84e4-4d9f-b289-11cabc2da983\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.959805 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df170e16-84e4-4d9f-b289-11cabc2da983-logs\") pod \"df170e16-84e4-4d9f-b289-11cabc2da983\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.959861 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-config-data\") pod 
\"df170e16-84e4-4d9f-b289-11cabc2da983\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.959977 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9ds\" (UniqueName: \"kubernetes.io/projected/df170e16-84e4-4d9f-b289-11cabc2da983-kube-api-access-9p9ds\") pod \"df170e16-84e4-4d9f-b289-11cabc2da983\" (UID: \"df170e16-84e4-4d9f-b289-11cabc2da983\") " Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.960523 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-scripts" (OuterVolumeSpecName: "scripts") pod "df170e16-84e4-4d9f-b289-11cabc2da983" (UID: "df170e16-84e4-4d9f-b289-11cabc2da983"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.961127 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df170e16-84e4-4d9f-b289-11cabc2da983-logs" (OuterVolumeSpecName: "logs") pod "df170e16-84e4-4d9f-b289-11cabc2da983" (UID: "df170e16-84e4-4d9f-b289-11cabc2da983"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.961602 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-config-data" (OuterVolumeSpecName: "config-data") pod "df170e16-84e4-4d9f-b289-11cabc2da983" (UID: "df170e16-84e4-4d9f-b289-11cabc2da983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.966266 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df170e16-84e4-4d9f-b289-11cabc2da983-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "df170e16-84e4-4d9f-b289-11cabc2da983" (UID: "df170e16-84e4-4d9f-b289-11cabc2da983"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:52 crc kubenswrapper[4992]: I0131 09:44:52.966336 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df170e16-84e4-4d9f-b289-11cabc2da983-kube-api-access-9p9ds" (OuterVolumeSpecName: "kube-api-access-9p9ds") pod "df170e16-84e4-4d9f-b289-11cabc2da983" (UID: "df170e16-84e4-4d9f-b289-11cabc2da983"). InnerVolumeSpecName "kube-api-access-9p9ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.061526 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-config\") pod \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.061903 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-combined-ca-bundle\") pod \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\" (UID: \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.061944 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9n9m\" (UniqueName: \"kubernetes.io/projected/cf4a75ee-0abf-46e9-ac05-14641a2fd782-kube-api-access-b9n9m\") pod \"cf4a75ee-0abf-46e9-ac05-14641a2fd782\" (UID: 
\"cf4a75ee-0abf-46e9-ac05-14641a2fd782\") " Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.062387 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9ds\" (UniqueName: \"kubernetes.io/projected/df170e16-84e4-4d9f-b289-11cabc2da983-kube-api-access-9p9ds\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.062400 4992 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df170e16-84e4-4d9f-b289-11cabc2da983-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.062410 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.062438 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df170e16-84e4-4d9f-b289-11cabc2da983-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.062452 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df170e16-84e4-4d9f-b289-11cabc2da983-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.066056 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4a75ee-0abf-46e9-ac05-14641a2fd782-kube-api-access-b9n9m" (OuterVolumeSpecName: "kube-api-access-b9n9m") pod "cf4a75ee-0abf-46e9-ac05-14641a2fd782" (UID: "cf4a75ee-0abf-46e9-ac05-14641a2fd782"). InnerVolumeSpecName "kube-api-access-b9n9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.084719 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-config" (OuterVolumeSpecName: "config") pod "cf4a75ee-0abf-46e9-ac05-14641a2fd782" (UID: "cf4a75ee-0abf-46e9-ac05-14641a2fd782"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.085235 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf4a75ee-0abf-46e9-ac05-14641a2fd782" (UID: "cf4a75ee-0abf-46e9-ac05-14641a2fd782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.164076 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.164112 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9n9m\" (UniqueName: \"kubernetes.io/projected/cf4a75ee-0abf-46e9-ac05-14641a2fd782-kube-api-access-b9n9m\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.164124 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf4a75ee-0abf-46e9-ac05-14641a2fd782-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.200346 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e974b81f-d9b4-4209-b9a6-72b22eec93e4" path="/var/lib/kubelet/pods/e974b81f-d9b4-4209-b9a6-72b22eec93e4/volumes" Jan 31 09:44:53 crc 
kubenswrapper[4992]: I0131 09:44:53.809665 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4mh4k" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.809671 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68c4b8b7d5-sdxvb" Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.858681 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68c4b8b7d5-sdxvb"] Jan 31 09:44:53 crc kubenswrapper[4992]: I0131 09:44:53.868510 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68c4b8b7d5-sdxvb"] Jan 31 09:44:53 crc kubenswrapper[4992]: E0131 09:44:53.957430 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 31 09:44:53 crc kubenswrapper[4992]: E0131 09:44:53.957806 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhwkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9dxlh_openstack(57f531e7-e05e-4537-bb22-01911330abd2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:44:53 crc kubenswrapper[4992]: E0131 09:44:53.959391 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9dxlh" podUID="57f531e7-e05e-4537-bb22-01911330abd2" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.050518 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-m8sjm"] Jan 31 09:44:54 crc kubenswrapper[4992]: E0131 09:44:54.050898 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4a75ee-0abf-46e9-ac05-14641a2fd782" containerName="neutron-db-sync" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.050911 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4a75ee-0abf-46e9-ac05-14641a2fd782" containerName="neutron-db-sync" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.051066 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4a75ee-0abf-46e9-ac05-14641a2fd782" containerName="neutron-db-sync" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.054046 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.084626 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-m8sjm"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.180832 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fd97c9468-24mb7"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.191805 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.191889 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhkz\" (UniqueName: \"kubernetes.io/projected/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-kube-api-access-zzhkz\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.191920 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.191962 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-config\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " 
pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.191981 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.192016 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.195213 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9v55r" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.195901 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.195993 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.197096 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.197934 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.199712 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fd97c9468-24mb7"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.268999 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.295610 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-config\") pod \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.295697 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-nb\") pod \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.295749 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-dns-svc\") pod \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.295806 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gckk\" (UniqueName: \"kubernetes.io/projected/25e5d88a-2fe1-45fe-a262-bb5ef7742563-kube-api-access-2gckk\") pod \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.295860 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-sb\") pod \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\" (UID: \"25e5d88a-2fe1-45fe-a262-bb5ef7742563\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.296080 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhkz\" 
(UniqueName: \"kubernetes.io/projected/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-kube-api-access-zzhkz\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.296124 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.296190 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-ovndb-tls-certs\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.296258 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-config\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.296407 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.297293 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.303889 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-config\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.305084 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.305921 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-combined-ca-bundle\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.306038 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-httpd-config\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.306574 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.306617 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwgg\" (UniqueName: \"kubernetes.io/projected/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-kube-api-access-9cwgg\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.306645 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-config\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.316708 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e5d88a-2fe1-45fe-a262-bb5ef7742563-kube-api-access-2gckk" (OuterVolumeSpecName: "kube-api-access-2gckk") pod "25e5d88a-2fe1-45fe-a262-bb5ef7742563" (UID: "25e5d88a-2fe1-45fe-a262-bb5ef7742563"). InnerVolumeSpecName "kube-api-access-2gckk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.326837 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.348315 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhkz\" (UniqueName: \"kubernetes.io/projected/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-kube-api-access-zzhkz\") pod \"dnsmasq-dns-5f66db59b9-m8sjm\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.363473 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-config" (OuterVolumeSpecName: "config") pod "25e5d88a-2fe1-45fe-a262-bb5ef7742563" (UID: "25e5d88a-2fe1-45fe-a262-bb5ef7742563"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.377976 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25e5d88a-2fe1-45fe-a262-bb5ef7742563" (UID: "25e5d88a-2fe1-45fe-a262-bb5ef7742563"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.388341 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25e5d88a-2fe1-45fe-a262-bb5ef7742563" (UID: "25e5d88a-2fe1-45fe-a262-bb5ef7742563"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409018 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sbxc\" (UniqueName: \"kubernetes.io/projected/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-kube-api-access-4sbxc\") pod \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409085 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-horizon-secret-key\") pod \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409145 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-scripts\") pod \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409224 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-logs\") pod \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409249 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-config-data\") pod \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\" (UID: \"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca\") " Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409625 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-ovndb-tls-certs\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409723 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-combined-ca-bundle\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409742 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-httpd-config\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409788 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwgg\" (UniqueName: \"kubernetes.io/projected/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-kube-api-access-9cwgg\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409803 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-config\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409848 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409858 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gckk\" (UniqueName: \"kubernetes.io/projected/25e5d88a-2fe1-45fe-a262-bb5ef7742563-kube-api-access-2gckk\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409869 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.409878 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.414234 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-scripts" (OuterVolumeSpecName: "scripts") pod "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" (UID: "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.414546 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-logs" (OuterVolumeSpecName: "logs") pod "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" (UID: "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.415077 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-config-data" (OuterVolumeSpecName: "config-data") pod "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" (UID: "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.415591 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-config\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.418053 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-httpd-config\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.418571 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-kube-api-access-4sbxc" (OuterVolumeSpecName: "kube-api-access-4sbxc") pod "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" (UID: "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca"). 
InnerVolumeSpecName "kube-api-access-4sbxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.423904 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" (UID: "abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.424591 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-ovndb-tls-certs\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.432270 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-combined-ca-bundle\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.439177 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwgg\" (UniqueName: \"kubernetes.io/projected/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-kube-api-access-9cwgg\") pod \"neutron-5fd97c9468-24mb7\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.443973 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25e5d88a-2fe1-45fe-a262-bb5ef7742563" (UID: 
"25e5d88a-2fe1-45fe-a262-bb5ef7742563"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.515391 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sbxc\" (UniqueName: \"kubernetes.io/projected/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-kube-api-access-4sbxc\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.515448 4992 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.515464 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25e5d88a-2fe1-45fe-a262-bb5ef7742563-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.515475 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.515486 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.515495 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.637731 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.639055 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.707449 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-52w86"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.733747 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d6fc5dc84-n2kln"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.741109 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65f6bf6f54-x2b8z"] Jan 31 09:44:54 crc kubenswrapper[4992]: W0131 09:44:54.753459 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ec5b46d_f009_46c5_a8a5_78b5b3afc50e.slice/crio-e0794a356a2117941a26b9581959a805c5acd36c307f8c344f68813f124d23f3 WatchSource:0}: Error finding container e0794a356a2117941a26b9581959a805c5acd36c307f8c344f68813f124d23f3: Status 404 returned error can't find the container with id e0794a356a2117941a26b9581959a805c5acd36c307f8c344f68813f124d23f3 Jan 31 09:44:54 crc kubenswrapper[4992]: W0131 09:44:54.760145 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ff2a8b_a743_475e_9ae5_5fb98839ba57.slice/crio-7d2291052b458b7f0597ada1d75c141764c44393a2ea1260a5b192420e953f59 WatchSource:0}: Error finding container 7d2291052b458b7f0597ada1d75c141764c44393a2ea1260a5b192420e953f59: Status 404 returned error can't find the container with id 7d2291052b458b7f0597ada1d75c141764c44393a2ea1260a5b192420e953f59 Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.823347 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" event={"ID":"25e5d88a-2fe1-45fe-a262-bb5ef7742563","Type":"ContainerDied","Data":"5a4d502db8c22ea4bf41071c2e5d5dc3b546a27ef85841d047ef3f7564c101e9"} Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 
09:44:54.823672 4992 scope.go:117] "RemoveContainer" containerID="8cda2b321cd22de41db6883e58f0e8a1afc3e092dd796236a9f3c64668bdea5f" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.823402 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554567b4f7-9nv8v" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.824794 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerStarted","Data":"20fa91fbba6cc74e59aebc509d7cbb6437d5c60aadc4238709bd45e2c190fe78"} Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.826106 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65f6bf6f54-x2b8z" event={"ID":"04ff2a8b-a743-475e-9ae5-5fb98839ba57","Type":"ContainerStarted","Data":"7d2291052b458b7f0597ada1d75c141764c44393a2ea1260a5b192420e953f59"} Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.834856 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-52w86" event={"ID":"cf1be806-c02e-4606-94ea-438caf8ef9c6","Type":"ContainerStarted","Data":"b345fd1a0a8caea6c211bd746ca6ab03ed3693d3d91df168abbb103b11aa700a"} Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.835784 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-96867c577-xw6dh" event={"ID":"abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca","Type":"ContainerDied","Data":"586b8b7201f643cb621a0b6bc1d70e990a8b182a51dca43a35345c3ac66281a7"} Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.835842 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-96867c577-xw6dh" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.850045 4992 scope.go:117] "RemoveContainer" containerID="fafcd8c478aa5fb4b8a11cea115a0511b557a1d7e80558d5d26bee5d9f02f0d1" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.851002 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d6fc5dc84-n2kln" event={"ID":"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e","Type":"ContainerStarted","Data":"e0794a356a2117941a26b9581959a805c5acd36c307f8c344f68813f124d23f3"} Jan 31 09:44:54 crc kubenswrapper[4992]: E0131 09:44:54.861725 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-9dxlh" podUID="57f531e7-e05e-4537-bb22-01911330abd2" Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.911242 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-9nv8v"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.923385 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-554567b4f7-9nv8v"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.969484 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-96867c577-xw6dh"] Jan 31 09:44:54 crc kubenswrapper[4992]: I0131 09:44:54.976563 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-96867c577-xw6dh"] Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.119172 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-m8sjm"] Jan 31 09:44:55 crc kubenswrapper[4992]: W0131 09:44:55.121148 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a93fcde_6251_4bfd_8b00_e8c1b7f233b3.slice/crio-d1b0de2dca7fc3735cfe33ab239456f9530332e08ecc5be29492a91a67ada5c0 WatchSource:0}: Error finding container d1b0de2dca7fc3735cfe33ab239456f9530332e08ecc5be29492a91a67ada5c0: Status 404 returned error can't find the container with id d1b0de2dca7fc3735cfe33ab239456f9530332e08ecc5be29492a91a67ada5c0 Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.201514 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" path="/var/lib/kubelet/pods/25e5d88a-2fe1-45fe-a262-bb5ef7742563/volumes" Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.202895 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca" path="/var/lib/kubelet/pods/abe80ec8-5bd1-4dda-b9c7-0295e8ad1dca/volumes" Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.203580 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df170e16-84e4-4d9f-b289-11cabc2da983" path="/var/lib/kubelet/pods/df170e16-84e4-4d9f-b289-11cabc2da983/volumes" Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.465389 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fd97c9468-24mb7"] Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.857247 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65f6bf6f54-x2b8z" event={"ID":"04ff2a8b-a743-475e-9ae5-5fb98839ba57","Type":"ContainerStarted","Data":"235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.865182 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-52w86" event={"ID":"cf1be806-c02e-4606-94ea-438caf8ef9c6","Type":"ContainerStarted","Data":"5085c4e9275d593ec2f9e83b58a2cdfefb630195561b76133d1f4174685ae5a0"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 
09:44:55.886899 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd97c9468-24mb7" event={"ID":"c7c5aa9b-d41b-4820-9156-a42c3e79bb38","Type":"ContainerStarted","Data":"05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.886943 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd97c9468-24mb7" event={"ID":"c7c5aa9b-d41b-4820-9156-a42c3e79bb38","Type":"ContainerStarted","Data":"fe08127f28f1b17d5d01ca4fce1334a16ebb1af2867c2ccee4bb5fa887a3602f"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.893873 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d6fc5dc84-n2kln" event={"ID":"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e","Type":"ContainerStarted","Data":"e4d72d255fdfa78cfbb10953afaff299c7be2a55bdcbfd549b2f17e1bf3677fb"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.895546 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerID="4f23ef9dc3234c0368c8e17634eba80dc7303ca9533175841a6bc6ec0c72dea9" exitCode=0 Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.895601 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" event={"ID":"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3","Type":"ContainerDied","Data":"4f23ef9dc3234c0368c8e17634eba80dc7303ca9533175841a6bc6ec0c72dea9"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.895618 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" event={"ID":"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3","Type":"ContainerStarted","Data":"d1b0de2dca7fc3735cfe33ab239456f9530332e08ecc5be29492a91a67ada5c0"} Jan 31 09:44:55 crc kubenswrapper[4992]: I0131 09:44:55.896890 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-52w86" podStartSLOduration=15.896874205 
podStartE2EDuration="15.896874205s" podCreationTimestamp="2026-01-31 09:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:55.887081843 +0000 UTC m=+1191.858473830" watchObservedRunningTime="2026-01-31 09:44:55.896874205 +0000 UTC m=+1191.868266192" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.488704 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-694d5dc9d5-kr7sf"] Jan 31 09:44:56 crc kubenswrapper[4992]: E0131 09:44:56.489524 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="init" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.489542 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="init" Jan 31 09:44:56 crc kubenswrapper[4992]: E0131 09:44:56.489556 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.489564 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.489760 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e5d88a-2fe1-45fe-a262-bb5ef7742563" containerName="dnsmasq-dns" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.490800 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.495970 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.496145 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.506496 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-694d5dc9d5-kr7sf"] Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.593020 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-public-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.593592 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-ovndb-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.593637 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-combined-ca-bundle\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.594057 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-httpd-config\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.594093 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-config\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.594134 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq97v\" (UniqueName: \"kubernetes.io/projected/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-kube-api-access-qq97v\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.594183 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-internal-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695435 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-internal-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695521 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-public-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695539 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-ovndb-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695566 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-combined-ca-bundle\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695594 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-httpd-config\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695614 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-config\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.695661 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq97v\" (UniqueName: \"kubernetes.io/projected/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-kube-api-access-qq97v\") pod 
\"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.703389 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-combined-ca-bundle\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.704441 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-internal-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.704472 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-ovndb-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.704546 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-public-tls-certs\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.705363 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-httpd-config\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 
09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.707637 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-config\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.726540 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq97v\" (UniqueName: \"kubernetes.io/projected/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-kube-api-access-qq97v\") pod \"neutron-694d5dc9d5-kr7sf\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.826556 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.923950 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd97c9468-24mb7" event={"ID":"c7c5aa9b-d41b-4820-9156-a42c3e79bb38","Type":"ContainerStarted","Data":"c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3"} Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.924981 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.933317 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d6fc5dc84-n2kln" event={"ID":"7ec5b46d-f009-46c5-a8a5-78b5b3afc50e","Type":"ContainerStarted","Data":"c3936977abfca4cfd9564800faf36e4c00fdf3eaf42d25f9ec9d32a9d10396b1"} Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.938962 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" 
event={"ID":"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3","Type":"ContainerStarted","Data":"4a9148aae46144f5a3fbd2baca2ad1b7cb5a1c167629337ce4eed1b9d7243b4a"} Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.940582 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.959659 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerStarted","Data":"06b73264b209677c52611b4cddf3bccca55e8e0c96743ceca7fdff048ee2011d"} Jan 31 09:44:56 crc kubenswrapper[4992]: I0131 09:44:56.963056 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fd97c9468-24mb7" podStartSLOduration=2.963044699 podStartE2EDuration="2.963044699s" podCreationTimestamp="2026-01-31 09:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:56.954956876 +0000 UTC m=+1192.926348873" watchObservedRunningTime="2026-01-31 09:44:56.963044699 +0000 UTC m=+1192.934436686" Jan 31 09:44:57 crc kubenswrapper[4992]: I0131 09:44:57.000940 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65f6bf6f54-x2b8z" event={"ID":"04ff2a8b-a743-475e-9ae5-5fb98839ba57","Type":"ContainerStarted","Data":"ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d"} Jan 31 09:44:57 crc kubenswrapper[4992]: I0131 09:44:57.024243 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" podStartSLOduration=3.024217901 podStartE2EDuration="3.024217901s" podCreationTimestamp="2026-01-31 09:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:44:56.9988507 +0000 UTC m=+1192.970242687" 
watchObservedRunningTime="2026-01-31 09:44:57.024217901 +0000 UTC m=+1192.995609898" Jan 31 09:44:57 crc kubenswrapper[4992]: I0131 09:44:57.064554 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d6fc5dc84-n2kln" podStartSLOduration=24.311899015 podStartE2EDuration="25.064534533s" podCreationTimestamp="2026-01-31 09:44:32 +0000 UTC" firstStartedPulling="2026-01-31 09:44:54.763885656 +0000 UTC m=+1190.735277643" lastFinishedPulling="2026-01-31 09:44:55.516521174 +0000 UTC m=+1191.487913161" observedRunningTime="2026-01-31 09:44:57.036891617 +0000 UTC m=+1193.008283614" watchObservedRunningTime="2026-01-31 09:44:57.064534533 +0000 UTC m=+1193.035926520" Jan 31 09:44:57 crc kubenswrapper[4992]: I0131 09:44:57.075274 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-65f6bf6f54-x2b8z" podStartSLOduration=24.192967007 podStartE2EDuration="25.075257762s" podCreationTimestamp="2026-01-31 09:44:32 +0000 UTC" firstStartedPulling="2026-01-31 09:44:54.763956798 +0000 UTC m=+1190.735348785" lastFinishedPulling="2026-01-31 09:44:55.646247553 +0000 UTC m=+1191.617639540" observedRunningTime="2026-01-31 09:44:57.069298631 +0000 UTC m=+1193.040690638" watchObservedRunningTime="2026-01-31 09:44:57.075257762 +0000 UTC m=+1193.046649749" Jan 31 09:44:57 crc kubenswrapper[4992]: I0131 09:44:57.554455 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-694d5dc9d5-kr7sf"] Jan 31 09:44:58 crc kubenswrapper[4992]: I0131 09:44:58.010004 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-694d5dc9d5-kr7sf" event={"ID":"3665c963-d0e3-4317-9bd8-50cc6d7bff5a","Type":"ContainerStarted","Data":"59db46195bce2b5d41af5cf68bff7eda52196ee04400f335a9a7f409d7bbf23f"} Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.035739 4992 generic.go:334] "Generic (PLEG): container finished" podID="cf1be806-c02e-4606-94ea-438caf8ef9c6" 
containerID="5085c4e9275d593ec2f9e83b58a2cdfefb630195561b76133d1f4174685ae5a0" exitCode=0 Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.035791 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-52w86" event={"ID":"cf1be806-c02e-4606-94ea-438caf8ef9c6","Type":"ContainerDied","Data":"5085c4e9275d593ec2f9e83b58a2cdfefb630195561b76133d1f4174685ae5a0"} Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.041830 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mbgzl" event={"ID":"db1e7454-fec7-4ec7-a2e2-5e4ebb145213","Type":"ContainerStarted","Data":"a5becdc9ce36f7d8808f5ce6cf9e9e9a2a315551ba934d4e45a418cedea58924"} Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.058243 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-694d5dc9d5-kr7sf" event={"ID":"3665c963-d0e3-4317-9bd8-50cc6d7bff5a","Type":"ContainerStarted","Data":"7530863dd25130eca663eeecef851785104f103d648bcad98c3e2140af7b6b11"} Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.058297 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-694d5dc9d5-kr7sf" event={"ID":"3665c963-d0e3-4317-9bd8-50cc6d7bff5a","Type":"ContainerStarted","Data":"239b180e3c7322ffc9253cb2e4049d21ad52ac110469b2670d6b859cbcadf901"} Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.058546 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.074200 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mbgzl" podStartSLOduration=2.000417641 podStartE2EDuration="36.074184561s" podCreationTimestamp="2026-01-31 09:44:24 +0000 UTC" firstStartedPulling="2026-01-31 09:44:25.422721374 +0000 UTC m=+1161.394113371" lastFinishedPulling="2026-01-31 09:44:59.496488304 +0000 UTC m=+1195.467880291" observedRunningTime="2026-01-31 
09:45:00.068568299 +0000 UTC m=+1196.039960286" watchObservedRunningTime="2026-01-31 09:45:00.074184561 +0000 UTC m=+1196.045576548" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.103678 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-694d5dc9d5-kr7sf" podStartSLOduration=4.103626749 podStartE2EDuration="4.103626749s" podCreationTimestamp="2026-01-31 09:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:00.095003851 +0000 UTC m=+1196.066395848" watchObservedRunningTime="2026-01-31 09:45:00.103626749 +0000 UTC m=+1196.075018746" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.162815 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw"] Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.164159 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.166554 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.166676 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.170895 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw"] Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.258403 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0f0b912-33a5-4498-84cf-e2f859245bb6-secret-volume\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.258548 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0f0b912-33a5-4498-84cf-e2f859245bb6-config-volume\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.258668 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj229\" (UniqueName: \"kubernetes.io/projected/d0f0b912-33a5-4498-84cf-e2f859245bb6-kube-api-access-sj229\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.360358 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0f0b912-33a5-4498-84cf-e2f859245bb6-secret-volume\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.360487 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0f0b912-33a5-4498-84cf-e2f859245bb6-config-volume\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.360628 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj229\" (UniqueName: \"kubernetes.io/projected/d0f0b912-33a5-4498-84cf-e2f859245bb6-kube-api-access-sj229\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.361747 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0f0b912-33a5-4498-84cf-e2f859245bb6-config-volume\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.365468 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d0f0b912-33a5-4498-84cf-e2f859245bb6-secret-volume\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.378835 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj229\" (UniqueName: \"kubernetes.io/projected/d0f0b912-33a5-4498-84cf-e2f859245bb6-kube-api-access-sj229\") pod \"collect-profiles-29497545-bmsmw\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:00 crc kubenswrapper[4992]: I0131 09:45:00.479536 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:01 crc kubenswrapper[4992]: I0131 09:45:01.081327 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw"] Jan 31 09:45:02 crc kubenswrapper[4992]: I0131 09:45:02.094724 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" event={"ID":"d0f0b912-33a5-4498-84cf-e2f859245bb6","Type":"ContainerStarted","Data":"6e388c48a933bc8f3f6d81f9dc8a3cd00d8826fe2b11a5e7705a3e2531d46aff"} Jan 31 09:45:02 crc kubenswrapper[4992]: I0131 09:45:02.829264 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:02 crc kubenswrapper[4992]: I0131 09:45:02.829311 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:02 crc kubenswrapper[4992]: I0131 09:45:02.957596 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:45:02 crc kubenswrapper[4992]: 
I0131 09:45:02.959038 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:45:03 crc kubenswrapper[4992]: I0131 09:45:03.110065 4992 generic.go:334] "Generic (PLEG): container finished" podID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" containerID="a5becdc9ce36f7d8808f5ce6cf9e9e9a2a315551ba934d4e45a418cedea58924" exitCode=0 Jan 31 09:45:03 crc kubenswrapper[4992]: I0131 09:45:03.110155 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mbgzl" event={"ID":"db1e7454-fec7-4ec7-a2e2-5e4ebb145213","Type":"ContainerDied","Data":"a5becdc9ce36f7d8808f5ce6cf9e9e9a2a315551ba934d4e45a418cedea58924"} Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:03.999971 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-52w86" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.024351 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-config-data\") pod \"cf1be806-c02e-4606-94ea-438caf8ef9c6\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.024392 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-combined-ca-bundle\") pod \"cf1be806-c02e-4606-94ea-438caf8ef9c6\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.024438 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-scripts\") pod \"cf1be806-c02e-4606-94ea-438caf8ef9c6\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.024461 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-fernet-keys\") pod \"cf1be806-c02e-4606-94ea-438caf8ef9c6\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.024487 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-credential-keys\") pod \"cf1be806-c02e-4606-94ea-438caf8ef9c6\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.024523 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grlnd\" (UniqueName: \"kubernetes.io/projected/cf1be806-c02e-4606-94ea-438caf8ef9c6-kube-api-access-grlnd\") pod \"cf1be806-c02e-4606-94ea-438caf8ef9c6\" (UID: \"cf1be806-c02e-4606-94ea-438caf8ef9c6\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.028845 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-scripts" (OuterVolumeSpecName: "scripts") pod "cf1be806-c02e-4606-94ea-438caf8ef9c6" (UID: "cf1be806-c02e-4606-94ea-438caf8ef9c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.041585 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cf1be806-c02e-4606-94ea-438caf8ef9c6" (UID: "cf1be806-c02e-4606-94ea-438caf8ef9c6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.048871 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cf1be806-c02e-4606-94ea-438caf8ef9c6" (UID: "cf1be806-c02e-4606-94ea-438caf8ef9c6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.050661 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1be806-c02e-4606-94ea-438caf8ef9c6-kube-api-access-grlnd" (OuterVolumeSpecName: "kube-api-access-grlnd") pod "cf1be806-c02e-4606-94ea-438caf8ef9c6" (UID: "cf1be806-c02e-4606-94ea-438caf8ef9c6"). InnerVolumeSpecName "kube-api-access-grlnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.070146 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf1be806-c02e-4606-94ea-438caf8ef9c6" (UID: "cf1be806-c02e-4606-94ea-438caf8ef9c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.098690 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-config-data" (OuterVolumeSpecName: "config-data") pod "cf1be806-c02e-4606-94ea-438caf8ef9c6" (UID: "cf1be806-c02e-4606-94ea-438caf8ef9c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.124109 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerStarted","Data":"7b937a4004691f2f165084eb4019197a06c9318f767b4288d38811fb1be2cf24"} Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125513 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125550 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125565 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125581 4992 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125592 4992 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cf1be806-c02e-4606-94ea-438caf8ef9c6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125604 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grlnd\" (UniqueName: \"kubernetes.io/projected/cf1be806-c02e-4606-94ea-438caf8ef9c6-kube-api-access-grlnd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 
09:45:04.125879 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-52w86" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125908 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-52w86" event={"ID":"cf1be806-c02e-4606-94ea-438caf8ef9c6","Type":"ContainerDied","Data":"b345fd1a0a8caea6c211bd746ca6ab03ed3693d3d91df168abbb103b11aa700a"} Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.125981 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b345fd1a0a8caea6c211bd746ca6ab03ed3693d3d91df168abbb103b11aa700a" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.408260 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mbgzl" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.537263 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-logs\") pod \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.537330 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-scripts\") pod \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.537403 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-combined-ca-bundle\") pod \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.537492 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4s9mg\" (UniqueName: \"kubernetes.io/projected/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-kube-api-access-4s9mg\") pod \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.537530 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-config-data\") pod \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\" (UID: \"db1e7454-fec7-4ec7-a2e2-5e4ebb145213\") " Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.538868 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-logs" (OuterVolumeSpecName: "logs") pod "db1e7454-fec7-4ec7-a2e2-5e4ebb145213" (UID: "db1e7454-fec7-4ec7-a2e2-5e4ebb145213"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.544569 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-kube-api-access-4s9mg" (OuterVolumeSpecName: "kube-api-access-4s9mg") pod "db1e7454-fec7-4ec7-a2e2-5e4ebb145213" (UID: "db1e7454-fec7-4ec7-a2e2-5e4ebb145213"). InnerVolumeSpecName "kube-api-access-4s9mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.544578 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-scripts" (OuterVolumeSpecName: "scripts") pod "db1e7454-fec7-4ec7-a2e2-5e4ebb145213" (UID: "db1e7454-fec7-4ec7-a2e2-5e4ebb145213"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.565367 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-config-data" (OuterVolumeSpecName: "config-data") pod "db1e7454-fec7-4ec7-a2e2-5e4ebb145213" (UID: "db1e7454-fec7-4ec7-a2e2-5e4ebb145213"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.580623 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db1e7454-fec7-4ec7-a2e2-5e4ebb145213" (UID: "db1e7454-fec7-4ec7-a2e2-5e4ebb145213"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.640120 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.640167 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.640179 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.640193 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s9mg\" (UniqueName: \"kubernetes.io/projected/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-kube-api-access-4s9mg\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 
crc kubenswrapper[4992]: I0131 09:45:04.640203 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e7454-fec7-4ec7-a2e2-5e4ebb145213-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.641577 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.697388 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-n2kll"] Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.697616 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="dnsmasq-dns" containerID="cri-o://c1bf1f06d94c1a57ef50cf1a6f46f96d135d9ed79f0f3b54ba6260b166312c71" gracePeriod=10 Jan 31 09:45:04 crc kubenswrapper[4992]: I0131 09:45:04.808840 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.142365 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mbgzl" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.142753 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mbgzl" event={"ID":"db1e7454-fec7-4ec7-a2e2-5e4ebb145213","Type":"ContainerDied","Data":"368e6cc17c9da448460aa7beb099f3d3acf28c3ebba95fcb99501df2b70a0b7c"} Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.142783 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="368e6cc17c9da448460aa7beb099f3d3acf28c3ebba95fcb99501df2b70a0b7c" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.146219 4992 generic.go:334] "Generic (PLEG): container finished" podID="d0f0b912-33a5-4498-84cf-e2f859245bb6" containerID="c8eb930ea3eb877bff2fbce7519ca54fb8f063e14b38f3e0ffb4d680157df299" exitCode=0 Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.146276 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" event={"ID":"d0f0b912-33a5-4498-84cf-e2f859245bb6","Type":"ContainerDied","Data":"c8eb930ea3eb877bff2fbce7519ca54fb8f063e14b38f3e0ffb4d680157df299"} Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.148055 4992 generic.go:334] "Generic (PLEG): container finished" podID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerID="c1bf1f06d94c1a57ef50cf1a6f46f96d135d9ed79f0f3b54ba6260b166312c71" exitCode=0 Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.148097 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" event={"ID":"c0b24a72-cd6e-425a-ac8f-f810990eb8df","Type":"ContainerDied","Data":"c1bf1f06d94c1a57ef50cf1a6f46f96d135d9ed79f0f3b54ba6260b166312c71"} Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.148113 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" 
event={"ID":"c0b24a72-cd6e-425a-ac8f-f810990eb8df","Type":"ContainerDied","Data":"cf65f5e840bc599af4190cfecba0374a819c44a97fc66163cbd9b19652397b55"} Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.148123 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf65f5e840bc599af4190cfecba0374a819c44a97fc66163cbd9b19652397b55" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.149100 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbgq7" event={"ID":"02f6c85a-822e-4864-b0aa-1c487d73721c","Type":"ContainerStarted","Data":"2d5b2cb07e909d54987e1d03967060cd8eb2bb0c5bd742d43d89cf64e66516c8"} Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.205698 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.207489 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55c8cdc56b-dkph6"] Jan 31 09:45:05 crc kubenswrapper[4992]: E0131 09:45:05.207858 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1be806-c02e-4606-94ea-438caf8ef9c6" containerName="keystone-bootstrap" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.207882 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1be806-c02e-4606-94ea-438caf8ef9c6" containerName="keystone-bootstrap" Jan 31 09:45:05 crc kubenswrapper[4992]: E0131 09:45:05.207895 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="dnsmasq-dns" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.207903 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="dnsmasq-dns" Jan 31 09:45:05 crc kubenswrapper[4992]: E0131 09:45:05.213790 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" 
containerName="placement-db-sync" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.213827 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" containerName="placement-db-sync" Jan 31 09:45:05 crc kubenswrapper[4992]: E0131 09:45:05.213885 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="init" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.213897 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="init" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.214261 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" containerName="placement-db-sync" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.214289 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" containerName="dnsmasq-dns" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.214305 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1be806-c02e-4606-94ea-438caf8ef9c6" containerName="keystone-bootstrap" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.214639 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lbgq7" podStartSLOduration=2.053722977 podStartE2EDuration="41.2146225s" podCreationTimestamp="2026-01-31 09:44:24 +0000 UTC" firstStartedPulling="2026-01-31 09:44:25.441426793 +0000 UTC m=+1161.412818780" lastFinishedPulling="2026-01-31 09:45:04.602326306 +0000 UTC m=+1200.573718303" observedRunningTime="2026-01-31 09:45:05.206376583 +0000 UTC m=+1201.177768950" watchObservedRunningTime="2026-01-31 09:45:05.2146225 +0000 UTC m=+1201.186014487" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.216351 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.229074 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.229124 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.229074 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-gkf5n" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.229650 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.229900 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.230152 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.253497 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55c8cdc56b-dkph6"] Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.321576 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-69d4477cc6-bk8rk"] Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.323669 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.329006 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.329230 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p8h77" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.329389 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.332364 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.332580 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.347507 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69d4477cc6-bk8rk"] Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.381810 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-config\") pod \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.381901 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-nb\") pod \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.382005 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-sb\") pod \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.382075 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skw88\" (UniqueName: \"kubernetes.io/projected/c0b24a72-cd6e-425a-ac8f-f810990eb8df-kube-api-access-skw88\") pod \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.383922 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-dns-svc\") pod \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\" (UID: \"c0b24a72-cd6e-425a-ac8f-f810990eb8df\") " Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384258 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-public-tls-certs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384304 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-combined-ca-bundle\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384332 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-scripts\") pod \"placement-69d4477cc6-bk8rk\" (UID: 
\"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384412 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-scripts\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384490 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-internal-tls-certs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384525 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxn6w\" (UniqueName: \"kubernetes.io/projected/1fdcaec9-5cd0-4117-bb84-413a80a5860c-kube-api-access-lxn6w\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384561 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdcaec9-5cd0-4117-bb84-413a80a5860c-logs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384591 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-public-tls-certs\") pod \"keystone-55c8cdc56b-dkph6\" (UID: 
\"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384618 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-credential-keys\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384672 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-combined-ca-bundle\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384723 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-config-data\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384759 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-internal-tls-certs\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384797 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-fernet-keys\") pod \"keystone-55c8cdc56b-dkph6\" 
(UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384888 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-config-data\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.384926 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5l6\" (UniqueName: \"kubernetes.io/projected/e269f327-9779-4b47-ab5b-bfae29d5bcb4-kube-api-access-tn5l6\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.401774 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b24a72-cd6e-425a-ac8f-f810990eb8df-kube-api-access-skw88" (OuterVolumeSpecName: "kube-api-access-skw88") pod "c0b24a72-cd6e-425a-ac8f-f810990eb8df" (UID: "c0b24a72-cd6e-425a-ac8f-f810990eb8df"). InnerVolumeSpecName "kube-api-access-skw88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.434344 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0b24a72-cd6e-425a-ac8f-f810990eb8df" (UID: "c0b24a72-cd6e-425a-ac8f-f810990eb8df"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.437145 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0b24a72-cd6e-425a-ac8f-f810990eb8df" (UID: "c0b24a72-cd6e-425a-ac8f-f810990eb8df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.457466 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-config" (OuterVolumeSpecName: "config") pod "c0b24a72-cd6e-425a-ac8f-f810990eb8df" (UID: "c0b24a72-cd6e-425a-ac8f-f810990eb8df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.473601 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0b24a72-cd6e-425a-ac8f-f810990eb8df" (UID: "c0b24a72-cd6e-425a-ac8f-f810990eb8df"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486441 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-scripts\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486506 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-internal-tls-certs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486528 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxn6w\" (UniqueName: \"kubernetes.io/projected/1fdcaec9-5cd0-4117-bb84-413a80a5860c-kube-api-access-lxn6w\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486550 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdcaec9-5cd0-4117-bb84-413a80a5860c-logs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486570 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-public-tls-certs\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc 
kubenswrapper[4992]: I0131 09:45:05.486586 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-credential-keys\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486613 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-combined-ca-bundle\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486636 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-config-data\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486655 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-internal-tls-certs\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486677 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-fernet-keys\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486713 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-config-data\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486734 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn5l6\" (UniqueName: \"kubernetes.io/projected/e269f327-9779-4b47-ab5b-bfae29d5bcb4-kube-api-access-tn5l6\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486770 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-public-tls-certs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486793 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-combined-ca-bundle\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486819 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-scripts\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486870 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486881 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486892 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486902 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0b24a72-cd6e-425a-ac8f-f810990eb8df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.486911 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skw88\" (UniqueName: \"kubernetes.io/projected/c0b24a72-cd6e-425a-ac8f-f810990eb8df-kube-api-access-skw88\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.489498 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-internal-tls-certs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.490900 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdcaec9-5cd0-4117-bb84-413a80a5860c-logs\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.491518 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-scripts\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.492216 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-scripts\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.492239 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-combined-ca-bundle\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.498186 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-internal-tls-certs\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.498655 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-public-tls-certs\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.498982 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-combined-ca-bundle\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.499484 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-credential-keys\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.499503 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-fernet-keys\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.499673 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e269f327-9779-4b47-ab5b-bfae29d5bcb4-config-data\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.504962 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-config-data\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.506976 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-public-tls-certs\") pod \"placement-69d4477cc6-bk8rk\" (UID: 
\"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.509726 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn5l6\" (UniqueName: \"kubernetes.io/projected/e269f327-9779-4b47-ab5b-bfae29d5bcb4-kube-api-access-tn5l6\") pod \"keystone-55c8cdc56b-dkph6\" (UID: \"e269f327-9779-4b47-ab5b-bfae29d5bcb4\") " pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.514343 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxn6w\" (UniqueName: \"kubernetes.io/projected/1fdcaec9-5cd0-4117-bb84-413a80a5860c-kube-api-access-lxn6w\") pod \"placement-69d4477cc6-bk8rk\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.534032 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.601296 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7796988564-hnmhv"] Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.604654 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.617935 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7796988564-hnmhv"] Jan 31 09:45:05 crc kubenswrapper[4992]: I0131 09:45:05.666287 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792312 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-public-tls-certs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792357 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-internal-tls-certs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792380 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-config-data\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792447 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smhdx\" (UniqueName: \"kubernetes.io/projected/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-kube-api-access-smhdx\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792477 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-scripts\") pod \"placement-7796988564-hnmhv\" (UID: 
\"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792507 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-logs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.792530 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-combined-ca-bundle\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894382 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-logs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894453 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-combined-ca-bundle\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894523 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-public-tls-certs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " 
pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894545 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-internal-tls-certs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894567 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-config-data\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894626 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smhdx\" (UniqueName: \"kubernetes.io/projected/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-kube-api-access-smhdx\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.894662 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-scripts\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.899608 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-logs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.900832 
4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-internal-tls-certs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.901573 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-scripts\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.902676 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-public-tls-certs\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.903759 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-config-data\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.908985 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-combined-ca-bundle\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.915553 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smhdx\" (UniqueName: 
\"kubernetes.io/projected/f09da6f2-c367-45e0-8293-e2b6a9b9df2c-kube-api-access-smhdx\") pod \"placement-7796988564-hnmhv\" (UID: \"f09da6f2-c367-45e0-8293-e2b6a9b9df2c\") " pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:05.928502 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:06.063958 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55c8cdc56b-dkph6"] Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:06.160309 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55c8cdc56b-dkph6" event={"ID":"e269f327-9779-4b47-ab5b-bfae29d5bcb4","Type":"ContainerStarted","Data":"7338f112959188b299c8f8eb5ed7d50541f29c85446057e6b2d2354ce6755f18"} Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:06.160369 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-n2kll" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:06.203732 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-n2kll"] Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:06.215326 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-n2kll"] Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:07.172123 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55c8cdc56b-dkph6" event={"ID":"e269f327-9779-4b47-ab5b-bfae29d5bcb4","Type":"ContainerStarted","Data":"8ebd903e941a326287496d698f55b2c574ab80dc0b2c91327cc39f2859eef1ef"} Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:07.172955 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:07.199563 4992 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-55c8cdc56b-dkph6" podStartSLOduration=2.199522769 podStartE2EDuration="2.199522769s" podCreationTimestamp="2026-01-31 09:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:07.192986921 +0000 UTC m=+1203.164378928" watchObservedRunningTime="2026-01-31 09:45:07.199522769 +0000 UTC m=+1203.170914776" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:07.208831 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b24a72-cd6e-425a-ac8f-f810990eb8df" path="/var/lib/kubelet/pods/c0b24a72-cd6e-425a-ac8f-f810990eb8df/volumes" Jan 31 09:45:07 crc kubenswrapper[4992]: I0131 09:45:07.968102 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.044190 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0f0b912-33a5-4498-84cf-e2f859245bb6-secret-volume\") pod \"d0f0b912-33a5-4498-84cf-e2f859245bb6\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.044724 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0f0b912-33a5-4498-84cf-e2f859245bb6-config-volume\") pod \"d0f0b912-33a5-4498-84cf-e2f859245bb6\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.044776 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj229\" (UniqueName: \"kubernetes.io/projected/d0f0b912-33a5-4498-84cf-e2f859245bb6-kube-api-access-sj229\") pod \"d0f0b912-33a5-4498-84cf-e2f859245bb6\" (UID: \"d0f0b912-33a5-4498-84cf-e2f859245bb6\") " Jan 31 
09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.045562 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f0b912-33a5-4498-84cf-e2f859245bb6-config-volume" (OuterVolumeSpecName: "config-volume") pod "d0f0b912-33a5-4498-84cf-e2f859245bb6" (UID: "d0f0b912-33a5-4498-84cf-e2f859245bb6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.047854 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7796988564-hnmhv"] Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.053930 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f0b912-33a5-4498-84cf-e2f859245bb6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d0f0b912-33a5-4498-84cf-e2f859245bb6" (UID: "d0f0b912-33a5-4498-84cf-e2f859245bb6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:08 crc kubenswrapper[4992]: W0131 09:45:08.056948 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf09da6f2_c367_45e0_8293_e2b6a9b9df2c.slice/crio-4b511cabf05fadb1fec709049d22d4330e0fec2d2ebe8e778103ce5eb47c2a9d WatchSource:0}: Error finding container 4b511cabf05fadb1fec709049d22d4330e0fec2d2ebe8e778103ce5eb47c2a9d: Status 404 returned error can't find the container with id 4b511cabf05fadb1fec709049d22d4330e0fec2d2ebe8e778103ce5eb47c2a9d Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.057252 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f0b912-33a5-4498-84cf-e2f859245bb6-kube-api-access-sj229" (OuterVolumeSpecName: "kube-api-access-sj229") pod "d0f0b912-33a5-4498-84cf-e2f859245bb6" (UID: "d0f0b912-33a5-4498-84cf-e2f859245bb6"). InnerVolumeSpecName "kube-api-access-sj229". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.063706 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69d4477cc6-bk8rk"] Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.147750 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0f0b912-33a5-4498-84cf-e2f859245bb6-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.147783 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0f0b912-33a5-4498-84cf-e2f859245bb6-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.147793 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj229\" (UniqueName: \"kubernetes.io/projected/d0f0b912-33a5-4498-84cf-e2f859245bb6-kube-api-access-sj229\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.181374 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7796988564-hnmhv" event={"ID":"f09da6f2-c367-45e0-8293-e2b6a9b9df2c","Type":"ContainerStarted","Data":"4b511cabf05fadb1fec709049d22d4330e0fec2d2ebe8e778103ce5eb47c2a9d"} Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.183660 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" event={"ID":"d0f0b912-33a5-4498-84cf-e2f859245bb6","Type":"ContainerDied","Data":"6e388c48a933bc8f3f6d81f9dc8a3cd00d8826fe2b11a5e7705a3e2531d46aff"} Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.183683 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e388c48a933bc8f3f6d81f9dc8a3cd00d8826fe2b11a5e7705a3e2531d46aff" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.183715 4992 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw" Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.188456 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9dxlh" event={"ID":"57f531e7-e05e-4537-bb22-01911330abd2","Type":"ContainerStarted","Data":"36c8234fe1065dca43d3dc49b6de769bc057e035a2564c2a249ff2eb5b982c7c"} Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.190501 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69d4477cc6-bk8rk" event={"ID":"1fdcaec9-5cd0-4117-bb84-413a80a5860c","Type":"ContainerStarted","Data":"800587f4aced5b2d899059735c504ca76bdab3abd022f9da1e4b366871f58417"} Jan 31 09:45:08 crc kubenswrapper[4992]: I0131 09:45:08.207768 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9dxlh" podStartSLOduration=2.957849984 podStartE2EDuration="45.207753572s" podCreationTimestamp="2026-01-31 09:44:23 +0000 UTC" firstStartedPulling="2026-01-31 09:44:25.150399437 +0000 UTC m=+1161.121791424" lastFinishedPulling="2026-01-31 09:45:07.400303005 +0000 UTC m=+1203.371695012" observedRunningTime="2026-01-31 09:45:08.2049004 +0000 UTC m=+1204.176292387" watchObservedRunningTime="2026-01-31 09:45:08.207753572 +0000 UTC m=+1204.179145559" Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.210372 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69d4477cc6-bk8rk" event={"ID":"1fdcaec9-5cd0-4117-bb84-413a80a5860c","Type":"ContainerStarted","Data":"0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e"} Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.210667 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69d4477cc6-bk8rk" event={"ID":"1fdcaec9-5cd0-4117-bb84-413a80a5860c","Type":"ContainerStarted","Data":"9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd"} Jan 31 
09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.211096 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.211153 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.215979 4992 generic.go:334] "Generic (PLEG): container finished" podID="02f6c85a-822e-4864-b0aa-1c487d73721c" containerID="2d5b2cb07e909d54987e1d03967060cd8eb2bb0c5bd742d43d89cf64e66516c8" exitCode=0 Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.216392 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbgq7" event={"ID":"02f6c85a-822e-4864-b0aa-1c487d73721c","Type":"ContainerDied","Data":"2d5b2cb07e909d54987e1d03967060cd8eb2bb0c5bd742d43d89cf64e66516c8"} Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.219978 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7796988564-hnmhv" event={"ID":"f09da6f2-c367-45e0-8293-e2b6a9b9df2c","Type":"ContainerStarted","Data":"b84c8fd36997a7be89641b99b7dec81159b616d8ceaa5f304805615267e13a93"} Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.220006 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7796988564-hnmhv" event={"ID":"f09da6f2-c367-45e0-8293-e2b6a9b9df2c","Type":"ContainerStarted","Data":"ada62a589c52efe8432dd6468198ce91f2ca7aaf591ab4f6009cc0fa545681ff"} Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.220817 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.220848 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.243035 4992 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/placement-69d4477cc6-bk8rk" podStartSLOduration=4.243016245 podStartE2EDuration="4.243016245s" podCreationTimestamp="2026-01-31 09:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:09.227700134 +0000 UTC m=+1205.199092121" watchObservedRunningTime="2026-01-31 09:45:09.243016245 +0000 UTC m=+1205.214408232" Jan 31 09:45:09 crc kubenswrapper[4992]: I0131 09:45:09.274675 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7796988564-hnmhv" podStartSLOduration=4.274650117 podStartE2EDuration="4.274650117s" podCreationTimestamp="2026-01-31 09:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:09.255595478 +0000 UTC m=+1205.226987485" watchObservedRunningTime="2026-01-31 09:45:09.274650117 +0000 UTC m=+1205.246042124" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.263901 4992 generic.go:334] "Generic (PLEG): container finished" podID="57f531e7-e05e-4537-bb22-01911330abd2" containerID="36c8234fe1065dca43d3dc49b6de769bc057e035a2564c2a249ff2eb5b982c7c" exitCode=0 Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.263987 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9dxlh" event={"ID":"57f531e7-e05e-4537-bb22-01911330abd2","Type":"ContainerDied","Data":"36c8234fe1065dca43d3dc49b6de769bc057e035a2564c2a249ff2eb5b982c7c"} Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.267474 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbgq7" event={"ID":"02f6c85a-822e-4864-b0aa-1c487d73721c","Type":"ContainerDied","Data":"9b0e8b374ac3cc33c2c554d05fcc1e77d073585e970502c09a8ad9baff93c314"} Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.267604 4992 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="9b0e8b374ac3cc33c2c554d05fcc1e77d073585e970502c09a8ad9baff93c314" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.275774 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.439614 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-combined-ca-bundle\") pod \"02f6c85a-822e-4864-b0aa-1c487d73721c\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.439678 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwwzk\" (UniqueName: \"kubernetes.io/projected/02f6c85a-822e-4864-b0aa-1c487d73721c-kube-api-access-bwwzk\") pod \"02f6c85a-822e-4864-b0aa-1c487d73721c\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.439773 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-db-sync-config-data\") pod \"02f6c85a-822e-4864-b0aa-1c487d73721c\" (UID: \"02f6c85a-822e-4864-b0aa-1c487d73721c\") " Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.445145 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02f6c85a-822e-4864-b0aa-1c487d73721c-kube-api-access-bwwzk" (OuterVolumeSpecName: "kube-api-access-bwwzk") pod "02f6c85a-822e-4864-b0aa-1c487d73721c" (UID: "02f6c85a-822e-4864-b0aa-1c487d73721c"). InnerVolumeSpecName "kube-api-access-bwwzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.450057 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "02f6c85a-822e-4864-b0aa-1c487d73721c" (UID: "02f6c85a-822e-4864-b0aa-1c487d73721c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.489626 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02f6c85a-822e-4864-b0aa-1c487d73721c" (UID: "02f6c85a-822e-4864-b0aa-1c487d73721c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.541208 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.541238 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwwzk\" (UniqueName: \"kubernetes.io/projected/02f6c85a-822e-4864-b0aa-1c487d73721c-kube-api-access-bwwzk\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.541249 4992 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/02f6c85a-822e-4864-b0aa-1c487d73721c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.831680 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-65f6bf6f54-x2b8z" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.140:8443: connect: connection refused" Jan 31 09:45:12 crc kubenswrapper[4992]: I0131 09:45:12.960294 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d6fc5dc84-n2kln" podUID="7ec5b46d-f009-46c5-a8a5-78b5b3afc50e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.141:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.141:8443: connect: connection refused" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.274442 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lbgq7" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.556501 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-c6bf9c9b7-9ntlg"] Jan 31 09:45:13 crc kubenswrapper[4992]: E0131 09:45:13.556913 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02f6c85a-822e-4864-b0aa-1c487d73721c" containerName="barbican-db-sync" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.556931 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f6c85a-822e-4864-b0aa-1c487d73721c" containerName="barbican-db-sync" Jan 31 09:45:13 crc kubenswrapper[4992]: E0131 09:45:13.556967 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f0b912-33a5-4498-84cf-e2f859245bb6" containerName="collect-profiles" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.556975 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f0b912-33a5-4498-84cf-e2f859245bb6" containerName="collect-profiles" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.557173 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="02f6c85a-822e-4864-b0aa-1c487d73721c" containerName="barbican-db-sync" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.557197 4992 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f0b912-33a5-4498-84cf-e2f859245bb6" containerName="collect-profiles" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.558294 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.561753 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.568624 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.568853 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-brjft" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.587842 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-config-data\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.587915 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tvc4\" (UniqueName: \"kubernetes.io/projected/e2f5ab71-2acb-484c-b6a8-51e447281183-kube-api-access-9tvc4\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.587949 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f5ab71-2acb-484c-b6a8-51e447281183-logs\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: 
\"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.588023 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-config-data-custom\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.588058 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-combined-ca-bundle\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.588354 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c6bf9c9b7-9ntlg"] Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.608499 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5c454c9bdb-2cf9g"] Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.610387 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.614581 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.640550 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c454c9bdb-2cf9g"] Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.689856 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-config-data-custom\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.689903 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-combined-ca-bundle\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.689925 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-config-data-custom\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.689980 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-combined-ca-bundle\") pod 
\"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.690012 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-config-data\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.690039 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tvc4\" (UniqueName: \"kubernetes.io/projected/e2f5ab71-2acb-484c-b6a8-51e447281183-kube-api-access-9tvc4\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.690063 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-config-data\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.690081 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f5ab71-2acb-484c-b6a8-51e447281183-logs\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.690100 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b2b51516-4104-4383-9312-e813d570ae69-logs\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.690140 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskpl\" (UniqueName: \"kubernetes.io/projected/b2b51516-4104-4383-9312-e813d570ae69-kube-api-access-nskpl\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.695879 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2f5ab71-2acb-484c-b6a8-51e447281183-logs\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.700026 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-combined-ca-bundle\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.700358 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-config-data\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.710018 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-x4kvm"] Jan 31 09:45:13 
crc kubenswrapper[4992]: I0131 09:45:13.711697 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.711844 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2f5ab71-2acb-484c-b6a8-51e447281183-config-data-custom\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.724348 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-x4kvm"] Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.732082 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tvc4\" (UniqueName: \"kubernetes.io/projected/e2f5ab71-2acb-484c-b6a8-51e447281183-kube-api-access-9tvc4\") pod \"barbican-worker-c6bf9c9b7-9ntlg\" (UID: \"e2f5ab71-2acb-484c-b6a8-51e447281183\") " pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.788102 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76ff5b49fd-jljf2"] Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.790204 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.793640 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-config-data-custom\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.793686 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-combined-ca-bundle\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.793743 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-config-data\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.793769 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b51516-4104-4383-9312-e813d570ae69-logs\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.793807 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nskpl\" (UniqueName: \"kubernetes.io/projected/b2b51516-4104-4383-9312-e813d570ae69-kube-api-access-nskpl\") 
pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.798083 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.798983 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2b51516-4104-4383-9312-e813d570ae69-logs\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.809174 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-config-data-custom\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.810129 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76ff5b49fd-jljf2"] Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.812197 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-config-data\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.813904 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b51516-4104-4383-9312-e813d570ae69-combined-ca-bundle\") pod 
\"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.820653 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskpl\" (UniqueName: \"kubernetes.io/projected/b2b51516-4104-4383-9312-e813d570ae69-kube-api-access-nskpl\") pod \"barbican-keystone-listener-5c454c9bdb-2cf9g\" (UID: \"b2b51516-4104-4383-9312-e813d570ae69\") " pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898308 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898371 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-dns-svc\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898431 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data-custom\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898456 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/57a39bd8-edb8-4744-bb08-2fc44760608a-logs\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898504 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-config\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898540 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89xxq\" (UniqueName: \"kubernetes.io/projected/57a39bd8-edb8-4744-bb08-2fc44760608a-kube-api-access-89xxq\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898567 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-combined-ca-bundle\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898592 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898664 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.898695 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv6xh\" (UniqueName: \"kubernetes.io/projected/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-kube-api-access-rv6xh\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.908229 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.942982 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" Jan 31 09:45:13 crc kubenswrapper[4992]: I0131 09:45:13.959502 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:13.999916 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data-custom\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:13.999977 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a39bd8-edb8-4744-bb08-2fc44760608a-logs\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000036 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-config\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000072 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89xxq\" (UniqueName: \"kubernetes.io/projected/57a39bd8-edb8-4744-bb08-2fc44760608a-kube-api-access-89xxq\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000104 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-combined-ca-bundle\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " 
pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000129 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000204 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000229 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv6xh\" (UniqueName: \"kubernetes.io/projected/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-kube-api-access-rv6xh\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000284 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.000310 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-dns-svc\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 
crc kubenswrapper[4992]: I0131 09:45:14.001383 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-dns-svc\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.003089 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.003840 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-config\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.004125 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a39bd8-edb8-4744-bb08-2fc44760608a-logs\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.005363 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.006530 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data-custom\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.008046 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-combined-ca-bundle\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.018993 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv6xh\" (UniqueName: \"kubernetes.io/projected/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-kube-api-access-rv6xh\") pod \"dnsmasq-dns-869f779d85-x4kvm\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.019173 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.023990 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89xxq\" (UniqueName: \"kubernetes.io/projected/57a39bd8-edb8-4744-bb08-2fc44760608a-kube-api-access-89xxq\") pod \"barbican-api-76ff5b49fd-jljf2\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.101179 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-combined-ca-bundle\") pod \"57f531e7-e05e-4537-bb22-01911330abd2\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.101242 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-db-sync-config-data\") pod \"57f531e7-e05e-4537-bb22-01911330abd2\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.101329 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-scripts\") pod \"57f531e7-e05e-4537-bb22-01911330abd2\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.101436 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f531e7-e05e-4537-bb22-01911330abd2-etc-machine-id\") pod \"57f531e7-e05e-4537-bb22-01911330abd2\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.101472 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhwkv\" (UniqueName: \"kubernetes.io/projected/57f531e7-e05e-4537-bb22-01911330abd2-kube-api-access-bhwkv\") pod \"57f531e7-e05e-4537-bb22-01911330abd2\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.101516 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-config-data\") pod \"57f531e7-e05e-4537-bb22-01911330abd2\" (UID: \"57f531e7-e05e-4537-bb22-01911330abd2\") " Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 
09:45:14.109193 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57f531e7-e05e-4537-bb22-01911330abd2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "57f531e7-e05e-4537-bb22-01911330abd2" (UID: "57f531e7-e05e-4537-bb22-01911330abd2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.115340 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "57f531e7-e05e-4537-bb22-01911330abd2" (UID: "57f531e7-e05e-4537-bb22-01911330abd2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.115484 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f531e7-e05e-4537-bb22-01911330abd2-kube-api-access-bhwkv" (OuterVolumeSpecName: "kube-api-access-bhwkv") pod "57f531e7-e05e-4537-bb22-01911330abd2" (UID: "57f531e7-e05e-4537-bb22-01911330abd2"). InnerVolumeSpecName "kube-api-access-bhwkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.115644 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-scripts" (OuterVolumeSpecName: "scripts") pod "57f531e7-e05e-4537-bb22-01911330abd2" (UID: "57f531e7-e05e-4537-bb22-01911330abd2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.149426 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57f531e7-e05e-4537-bb22-01911330abd2" (UID: "57f531e7-e05e-4537-bb22-01911330abd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.157617 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-config-data" (OuterVolumeSpecName: "config-data") pod "57f531e7-e05e-4537-bb22-01911330abd2" (UID: "57f531e7-e05e-4537-bb22-01911330abd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.197925 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.203957 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.203995 4992 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57f531e7-e05e-4537-bb22-01911330abd2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.204009 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhwkv\" (UniqueName: \"kubernetes.io/projected/57f531e7-e05e-4537-bb22-01911330abd2-kube-api-access-bhwkv\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.204019 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.204031 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.204042 4992 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/57f531e7-e05e-4537-bb22-01911330abd2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.216748 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.287833 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9dxlh" event={"ID":"57f531e7-e05e-4537-bb22-01911330abd2","Type":"ContainerDied","Data":"637a267f6f098ba2530c0424a5d8fbb123eb5ed1503b83373f6efad7fe174ca9"} Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.287878 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637a267f6f098ba2530c0424a5d8fbb123eb5ed1503b83373f6efad7fe174ca9" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.287940 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9dxlh" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.647975 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:14 crc kubenswrapper[4992]: E0131 09:45:14.648974 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f531e7-e05e-4537-bb22-01911330abd2" containerName="cinder-db-sync" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.648993 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f531e7-e05e-4537-bb22-01911330abd2" containerName="cinder-db-sync" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.649288 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f531e7-e05e-4537-bb22-01911330abd2" containerName="cinder-db-sync" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.650663 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.659476 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-h4h6m" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.659568 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.659676 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.659893 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.733503 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.764059 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-x4kvm"] Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.817042 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.817122 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.817155 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75fzw\" 
(UniqueName: \"kubernetes.io/projected/a7852c78-5488-4147-8160-f521bcdb5075-kube-api-access-75fzw\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.817223 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7852c78-5488-4147-8160-f521bcdb5075-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.817353 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.817382 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.839481 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-psm5z"] Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.841170 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.851182 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-psm5z"] Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.918588 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7852c78-5488-4147-8160-f521bcdb5075-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.918909 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919118 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919158 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919349 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919376 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919410 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrgqs\" (UniqueName: \"kubernetes.io/projected/5ae0611b-2e81-4f74-9acd-b52769aadd42-kube-api-access-lrgqs\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919443 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919471 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-dns-svc\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919493 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75fzw\" (UniqueName: 
\"kubernetes.io/projected/a7852c78-5488-4147-8160-f521bcdb5075-kube-api-access-75fzw\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919525 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-config\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919627 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7852c78-5488-4147-8160-f521bcdb5075-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.919908 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.921901 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.925195 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.928246 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.935974 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.939847 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.948396 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.951587 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75fzw\" (UniqueName: \"kubernetes.io/projected/a7852c78-5488-4147-8160-f521bcdb5075-kube-api-access-75fzw\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.952620 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-c6bf9c9b7-9ntlg"] Jan 31 09:45:14 crc 
kubenswrapper[4992]: I0131 09:45:14.970189 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:14 crc kubenswrapper[4992]: I0131 09:45:14.994336 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021064 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021163 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021201 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d14b8bc4-65b8-4dce-aa68-0362de63be9d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021216 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14b8bc4-65b8-4dce-aa68-0362de63be9d-logs\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " 
pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021249 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021450 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-scripts\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021481 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrgqs\" (UniqueName: \"kubernetes.io/projected/5ae0611b-2e81-4f74-9acd-b52769aadd42-kube-api-access-lrgqs\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021505 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6w9s\" (UniqueName: \"kubernetes.io/projected/d14b8bc4-65b8-4dce-aa68-0362de63be9d-kube-api-access-d6w9s\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021527 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-dns-svc\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 
09:45:15.021559 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021575 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data-custom\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.021598 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-config\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.022297 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-config\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.022321 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.023009 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-dns-svc\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.023135 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.045596 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrgqs\" (UniqueName: \"kubernetes.io/projected/5ae0611b-2e81-4f74-9acd-b52769aadd42-kube-api-access-lrgqs\") pod \"dnsmasq-dns-58db5546cc-psm5z\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134495 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134654 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d14b8bc4-65b8-4dce-aa68-0362de63be9d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134697 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14b8bc4-65b8-4dce-aa68-0362de63be9d-logs\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " 
pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134754 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d14b8bc4-65b8-4dce-aa68-0362de63be9d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134785 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-scripts\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134856 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6w9s\" (UniqueName: \"kubernetes.io/projected/d14b8bc4-65b8-4dce-aa68-0362de63be9d-kube-api-access-d6w9s\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134932 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.134947 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data-custom\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.135062 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d14b8bc4-65b8-4dce-aa68-0362de63be9d-logs\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.138494 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-scripts\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.139635 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.140283 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.147171 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data-custom\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.154727 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6w9s\" (UniqueName: \"kubernetes.io/projected/d14b8bc4-65b8-4dce-aa68-0362de63be9d-kube-api-access-d6w9s\") pod \"cinder-api-0\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.183396 4992 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.223189 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-x4kvm"] Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.301571 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.301634 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.312557 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" event={"ID":"e2f5ab71-2acb-484c-b6a8-51e447281183","Type":"ContainerStarted","Data":"566d2ea7d9aedaa25381b26960fd72cf75a48ab9e3f12b7bcadf11b052bff0c1"} Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.333994 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" event={"ID":"bb03cbe4-dd01-4ffb-a581-8d16b41c2102","Type":"ContainerStarted","Data":"a2b8126ff12a5006d6fe0eada7925d581620f32fe89d71b5e3b144264fcdbe0a"} Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.350341 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c454c9bdb-2cf9g"] Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.360892 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerStarted","Data":"df630815df72f8587ad3b6d057ff48adb51c66196406bd535e51e1081c241e6d"} Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.361043 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-central-agent" containerID="cri-o://20fa91fbba6cc74e59aebc509d7cbb6437d5c60aadc4238709bd45e2c190fe78" gracePeriod=30 Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.361110 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.361448 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="proxy-httpd" containerID="cri-o://df630815df72f8587ad3b6d057ff48adb51c66196406bd535e51e1081c241e6d" gracePeriod=30 Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.361503 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="sg-core" containerID="cri-o://7b937a4004691f2f165084eb4019197a06c9318f767b4288d38811fb1be2cf24" gracePeriod=30 Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.361527 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-notification-agent" containerID="cri-o://06b73264b209677c52611b4cddf3bccca55e8e0c96743ceca7fdff048ee2011d" gracePeriod=30 Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.374149 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76ff5b49fd-jljf2"] Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.388716 4992 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.076019739 podStartE2EDuration="51.388698003s" podCreationTimestamp="2026-01-31 09:44:24 +0000 UTC" firstStartedPulling="2026-01-31 09:44:25.624998103 +0000 UTC m=+1161.596390090" lastFinishedPulling="2026-01-31 09:45:14.937676367 +0000 UTC m=+1210.909068354" observedRunningTime="2026-01-31 09:45:15.388285651 +0000 UTC m=+1211.359677658" watchObservedRunningTime="2026-01-31 09:45:15.388698003 +0000 UTC m=+1211.360089990" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.398644 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.568664 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.805068 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-psm5z"] Jan 31 09:45:15 crc kubenswrapper[4992]: I0131 09:45:15.967449 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.384619 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" event={"ID":"b2b51516-4104-4383-9312-e813d570ae69","Type":"ContainerStarted","Data":"affaad78f97555bde3fd04268c33d087cf94af3559d1eed9d219ba493140eb5b"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.405497 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76ff5b49fd-jljf2" event={"ID":"57a39bd8-edb8-4744-bb08-2fc44760608a","Type":"ContainerStarted","Data":"d5505953e505e2daa4904c4a87c2192e7dada42b708ee67c12c721cb3675d953"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.405563 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76ff5b49fd-jljf2" 
event={"ID":"57a39bd8-edb8-4744-bb08-2fc44760608a","Type":"ContainerStarted","Data":"2a8594eb86aee7b4010ece44360e831016a772f6c8b4c3784e6e296719b93825"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.405576 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76ff5b49fd-jljf2" event={"ID":"57a39bd8-edb8-4744-bb08-2fc44760608a","Type":"ContainerStarted","Data":"f685a2e72e97238d382e62230f5c20046511adfbdbe31f29700e1d2d03108501"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.405669 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.405862 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.446160 4992 generic.go:334] "Generic (PLEG): container finished" podID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerID="6d938c484e6def42119b72f5b2bc79c6223d33f1c2b5107158025f9ea22a007e" exitCode=0 Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.446308 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" event={"ID":"5ae0611b-2e81-4f74-9acd-b52769aadd42","Type":"ContainerDied","Data":"6d938c484e6def42119b72f5b2bc79c6223d33f1c2b5107158025f9ea22a007e"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.446362 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" event={"ID":"5ae0611b-2e81-4f74-9acd-b52769aadd42","Type":"ContainerStarted","Data":"d728275de79f33bf97691fe37a91d54baba3bc1623cc0f5f13bab36c7b1eee0e"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.486033 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"d14b8bc4-65b8-4dce-aa68-0362de63be9d","Type":"ContainerStarted","Data":"918e3ac5bf5d0157764d7335d08dc602e688f9259408e62a9030d67429b25042"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.498309 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76ff5b49fd-jljf2" podStartSLOduration=3.498286528 podStartE2EDuration="3.498286528s" podCreationTimestamp="2026-01-31 09:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:16.442811689 +0000 UTC m=+1212.414203696" watchObservedRunningTime="2026-01-31 09:45:16.498286528 +0000 UTC m=+1212.469678525" Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.500398 4992 generic.go:334] "Generic (PLEG): container finished" podID="bb03cbe4-dd01-4ffb-a581-8d16b41c2102" containerID="a86a3dfb5ba039ec70af1f42386c20c299b3f41f50e2b955ffbeb4dba20b3c85" exitCode=0 Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.500739 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" event={"ID":"bb03cbe4-dd01-4ffb-a581-8d16b41c2102","Type":"ContainerDied","Data":"a86a3dfb5ba039ec70af1f42386c20c299b3f41f50e2b955ffbeb4dba20b3c85"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.508442 4992 generic.go:334] "Generic (PLEG): container finished" podID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerID="df630815df72f8587ad3b6d057ff48adb51c66196406bd535e51e1081c241e6d" exitCode=0 Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.508474 4992 generic.go:334] "Generic (PLEG): container finished" podID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerID="7b937a4004691f2f165084eb4019197a06c9318f767b4288d38811fb1be2cf24" exitCode=2 Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.508484 4992 generic.go:334] "Generic (PLEG): container finished" podID="014742b0-d197-4f2d-9186-3fb1daa4318e" 
containerID="20fa91fbba6cc74e59aebc509d7cbb6437d5c60aadc4238709bd45e2c190fe78" exitCode=0 Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.508536 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerDied","Data":"df630815df72f8587ad3b6d057ff48adb51c66196406bd535e51e1081c241e6d"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.508563 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerDied","Data":"7b937a4004691f2f165084eb4019197a06c9318f767b4288d38811fb1be2cf24"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.508574 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerDied","Data":"20fa91fbba6cc74e59aebc509d7cbb6437d5c60aadc4238709bd45e2c190fe78"} Jan 31 09:45:16 crc kubenswrapper[4992]: I0131 09:45:16.509732 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7852c78-5488-4147-8160-f521bcdb5075","Type":"ContainerStarted","Data":"5b3e0963f9591945cf42c788604bbeb2d5a2d47c31554d71a4c8d7b9c907912b"} Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.135765 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.301477 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-nb\") pod \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.301578 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-config\") pod \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.301616 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-sb\") pod \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.301702 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv6xh\" (UniqueName: \"kubernetes.io/projected/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-kube-api-access-rv6xh\") pod \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.301798 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-dns-svc\") pod \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\" (UID: \"bb03cbe4-dd01-4ffb-a581-8d16b41c2102\") " Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.332734 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:17 crc 
kubenswrapper[4992]: I0131 09:45:17.333141 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb03cbe4-dd01-4ffb-a581-8d16b41c2102" (UID: "bb03cbe4-dd01-4ffb-a581-8d16b41c2102"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.348578 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb03cbe4-dd01-4ffb-a581-8d16b41c2102" (UID: "bb03cbe4-dd01-4ffb-a581-8d16b41c2102"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.356785 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-kube-api-access-rv6xh" (OuterVolumeSpecName: "kube-api-access-rv6xh") pod "bb03cbe4-dd01-4ffb-a581-8d16b41c2102" (UID: "bb03cbe4-dd01-4ffb-a581-8d16b41c2102"). InnerVolumeSpecName "kube-api-access-rv6xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.362059 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb03cbe4-dd01-4ffb-a581-8d16b41c2102" (UID: "bb03cbe4-dd01-4ffb-a581-8d16b41c2102"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.385470 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-config" (OuterVolumeSpecName: "config") pod "bb03cbe4-dd01-4ffb-a581-8d16b41c2102" (UID: "bb03cbe4-dd01-4ffb-a581-8d16b41c2102"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.404875 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.404906 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.405039 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.405137 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv6xh\" (UniqueName: \"kubernetes.io/projected/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-kube-api-access-rv6xh\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.405150 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb03cbe4-dd01-4ffb-a581-8d16b41c2102-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.519340 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"d14b8bc4-65b8-4dce-aa68-0362de63be9d","Type":"ContainerStarted","Data":"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73"} Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.521039 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" event={"ID":"bb03cbe4-dd01-4ffb-a581-8d16b41c2102","Type":"ContainerDied","Data":"a2b8126ff12a5006d6fe0eada7925d581620f32fe89d71b5e3b144264fcdbe0a"} Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.521084 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-x4kvm" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.521091 4992 scope.go:117] "RemoveContainer" containerID="a86a3dfb5ba039ec70af1f42386c20c299b3f41f50e2b955ffbeb4dba20b3c85" Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.585510 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-x4kvm"] Jan 31 09:45:17 crc kubenswrapper[4992]: I0131 09:45:17.593102 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-x4kvm"] Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.535872 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d14b8bc4-65b8-4dce-aa68-0362de63be9d","Type":"ContainerStarted","Data":"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.536359 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.536322 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api" containerID="cri-o://0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65" gracePeriod=30 Jan 31 09:45:18 crc 
kubenswrapper[4992]: I0131 09:45:18.535918 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api-log" containerID="cri-o://df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73" gracePeriod=30 Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.547364 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7852c78-5488-4147-8160-f521bcdb5075","Type":"ContainerStarted","Data":"7b3b8313f535d16404f9acdded4ef888d1ac328000359cccc564a1ca4e6a5807"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.550640 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" event={"ID":"b2b51516-4104-4383-9312-e813d570ae69","Type":"ContainerStarted","Data":"72ed1a6442bceea01563c0e61d23d36de26cd3f176bb1056142d770f8a01fba2"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.550678 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" event={"ID":"b2b51516-4104-4383-9312-e813d570ae69","Type":"ContainerStarted","Data":"5d7d65a9f20fb8b53a9f55ab01d2ee03beeddadc0cf50778c5d044bf7a27b67e"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.555948 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.555935093 podStartE2EDuration="4.555935093s" podCreationTimestamp="2026-01-31 09:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:18.555380687 +0000 UTC m=+1214.526772674" watchObservedRunningTime="2026-01-31 09:45:18.555935093 +0000 UTC m=+1214.527327080" Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.567529 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-58db5546cc-psm5z" event={"ID":"5ae0611b-2e81-4f74-9acd-b52769aadd42","Type":"ContainerStarted","Data":"8259faf3193d7dcf85bd42def3fe64eeb0fb5a1232744a428e679080fea37690"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.567685 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.572401 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" event={"ID":"e2f5ab71-2acb-484c-b6a8-51e447281183","Type":"ContainerStarted","Data":"7c2938aeae34159729ec97d032c555a4d047e41b6f81058cb4659999aee7c1f7"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.572508 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" event={"ID":"e2f5ab71-2acb-484c-b6a8-51e447281183","Type":"ContainerStarted","Data":"e4a288896f1d9a46145df36df814dab70347bb4a39f4aa0827c1a35c4f10b6a8"} Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.579700 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5c454c9bdb-2cf9g" podStartSLOduration=3.295137022 podStartE2EDuration="5.579682346s" podCreationTimestamp="2026-01-31 09:45:13 +0000 UTC" firstStartedPulling="2026-01-31 09:45:15.349100511 +0000 UTC m=+1211.320492498" lastFinishedPulling="2026-01-31 09:45:17.633645835 +0000 UTC m=+1213.605037822" observedRunningTime="2026-01-31 09:45:18.574993011 +0000 UTC m=+1214.546385008" watchObservedRunningTime="2026-01-31 09:45:18.579682346 +0000 UTC m=+1214.551074333" Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.604985 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" podStartSLOduration=4.604967825 podStartE2EDuration="4.604967825s" podCreationTimestamp="2026-01-31 09:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:18.595015658 +0000 UTC m=+1214.566407655" watchObservedRunningTime="2026-01-31 09:45:18.604967825 +0000 UTC m=+1214.576359812" Jan 31 09:45:18 crc kubenswrapper[4992]: I0131 09:45:18.611524 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-c6bf9c9b7-9ntlg" podStartSLOduration=2.984556193 podStartE2EDuration="5.611511123s" podCreationTimestamp="2026-01-31 09:45:13 +0000 UTC" firstStartedPulling="2026-01-31 09:45:15.006670104 +0000 UTC m=+1210.978062091" lastFinishedPulling="2026-01-31 09:45:17.633625034 +0000 UTC m=+1213.605017021" observedRunningTime="2026-01-31 09:45:18.609248958 +0000 UTC m=+1214.580640945" watchObservedRunningTime="2026-01-31 09:45:18.611511123 +0000 UTC m=+1214.582903110" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.193703 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb03cbe4-dd01-4ffb-a581-8d16b41c2102" path="/var/lib/kubelet/pods/bb03cbe4-dd01-4ffb-a581-8d16b41c2102/volumes" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.544960 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562604 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-scripts\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562657 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data-custom\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562691 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6w9s\" (UniqueName: \"kubernetes.io/projected/d14b8bc4-65b8-4dce-aa68-0362de63be9d-kube-api-access-d6w9s\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562727 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562760 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14b8bc4-65b8-4dce-aa68-0362de63be9d-logs\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562849 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d14b8bc4-65b8-4dce-aa68-0362de63be9d-etc-machine-id\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.562873 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-combined-ca-bundle\") pod \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\" (UID: \"d14b8bc4-65b8-4dce-aa68-0362de63be9d\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.568446 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-scripts" (OuterVolumeSpecName: "scripts") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.568695 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d14b8bc4-65b8-4dce-aa68-0362de63be9d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.568796 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d14b8bc4-65b8-4dce-aa68-0362de63be9d-logs" (OuterVolumeSpecName: "logs") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.573997 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14b8bc4-65b8-4dce-aa68-0362de63be9d-kube-api-access-d6w9s" (OuterVolumeSpecName: "kube-api-access-d6w9s") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "kube-api-access-d6w9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.603655 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.611851 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.623488 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630353 4992 generic.go:334] "Generic (PLEG): container finished" podID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerID="0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65" exitCode=0 Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630393 4992 generic.go:334] "Generic (PLEG): container finished" podID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerID="df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73" exitCode=143 Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630453 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d14b8bc4-65b8-4dce-aa68-0362de63be9d","Type":"ContainerDied","Data":"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65"} Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630478 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d14b8bc4-65b8-4dce-aa68-0362de63be9d","Type":"ContainerDied","Data":"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73"} Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630488 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d14b8bc4-65b8-4dce-aa68-0362de63be9d","Type":"ContainerDied","Data":"918e3ac5bf5d0157764d7335d08dc602e688f9259408e62a9030d67429b25042"} Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630502 4992 scope.go:117] "RemoveContainer" containerID="0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.630613 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.636706 4992 generic.go:334] "Generic (PLEG): container finished" podID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerID="06b73264b209677c52611b4cddf3bccca55e8e0c96743ceca7fdff048ee2011d" exitCode=0 Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.636779 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"014742b0-d197-4f2d-9186-3fb1daa4318e","Type":"ContainerDied","Data":"06b73264b209677c52611b4cddf3bccca55e8e0c96743ceca7fdff048ee2011d"} Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.636824 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.646475 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7852c78-5488-4147-8160-f521bcdb5075","Type":"ContainerStarted","Data":"50cad3134a8b08211c079d64aa6d01aa9ba5fcc68f7b5bc3a122cb9f4fee20c6"} Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.662686 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data" (OuterVolumeSpecName: "config-data") pod "d14b8bc4-65b8-4dce-aa68-0362de63be9d" (UID: "d14b8bc4-65b8-4dce-aa68-0362de63be9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.664683 4992 scope.go:117] "RemoveContainer" containerID="df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666254 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-run-httpd\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666329 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drjzg\" (UniqueName: \"kubernetes.io/projected/014742b0-d197-4f2d-9186-3fb1daa4318e-kube-api-access-drjzg\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666489 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-combined-ca-bundle\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666548 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-config-data\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666655 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-scripts\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 
09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666705 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-sg-core-conf-yaml\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666841 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.666961 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-log-httpd\") pod \"014742b0-d197-4f2d-9186-3fb1daa4318e\" (UID: \"014742b0-d197-4f2d-9186-3fb1daa4318e\") " Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.671952 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.677798 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014742b0-d197-4f2d-9186-3fb1daa4318e-kube-api-access-drjzg" (OuterVolumeSpecName: "kube-api-access-drjzg") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "kube-api-access-drjzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680259 4992 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d14b8bc4-65b8-4dce-aa68-0362de63be9d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680286 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680300 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680315 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/014742b0-d197-4f2d-9186-3fb1daa4318e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680327 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drjzg\" (UniqueName: \"kubernetes.io/projected/014742b0-d197-4f2d-9186-3fb1daa4318e-kube-api-access-drjzg\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680340 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680353 4992 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680366 4992 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6w9s\" (UniqueName: \"kubernetes.io/projected/d14b8bc4-65b8-4dce-aa68-0362de63be9d-kube-api-access-d6w9s\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680378 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d14b8bc4-65b8-4dce-aa68-0362de63be9d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.680389 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d14b8bc4-65b8-4dce-aa68-0362de63be9d-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.684676 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.936688984 podStartE2EDuration="5.684652188s" podCreationTimestamp="2026-01-31 09:45:14 +0000 UTC" firstStartedPulling="2026-01-31 09:45:15.577550135 +0000 UTC m=+1211.548942122" lastFinishedPulling="2026-01-31 09:45:16.325513339 +0000 UTC m=+1212.296905326" observedRunningTime="2026-01-31 09:45:19.679327624 +0000 UTC m=+1215.650719641" watchObservedRunningTime="2026-01-31 09:45:19.684652188 +0000 UTC m=+1215.656044195" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.686695 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-scripts" (OuterVolumeSpecName: "scripts") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.707523 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.707699 4992 scope.go:117] "RemoveContainer" containerID="0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65" Jan 31 09:45:19 crc kubenswrapper[4992]: E0131 09:45:19.710579 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65\": container with ID starting with 0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65 not found: ID does not exist" containerID="0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.710616 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65"} err="failed to get container status \"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65\": rpc error: code = NotFound desc = could not find container \"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65\": container with ID starting with 0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65 not found: ID does not exist" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.710640 4992 scope.go:117] "RemoveContainer" containerID="df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73" Jan 31 09:45:19 crc kubenswrapper[4992]: E0131 09:45:19.711715 4992 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73\": container with ID starting with df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73 not found: ID does not exist" containerID="df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.711791 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73"} err="failed to get container status \"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73\": rpc error: code = NotFound desc = could not find container \"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73\": container with ID starting with df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73 not found: ID does not exist" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.711855 4992 scope.go:117] "RemoveContainer" containerID="0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.712219 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65"} err="failed to get container status \"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65\": rpc error: code = NotFound desc = could not find container \"0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65\": container with ID starting with 0593d80661002d13800ab05b1bc9cac584b64dff7221e8ad7e6c17707032df65 not found: ID does not exist" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.712243 4992 scope.go:117] "RemoveContainer" containerID="df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.712611 4992 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73"} err="failed to get container status \"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73\": rpc error: code = NotFound desc = could not find container \"df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73\": container with ID starting with df133e8d09d69824755a2e16c8fabdf3ef1c4e48f6e48b2598486d7e8b734a73 not found: ID does not exist" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.712634 4992 scope.go:117] "RemoveContainer" containerID="df630815df72f8587ad3b6d057ff48adb51c66196406bd535e51e1081c241e6d" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.780861 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.782137 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.782163 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.782175 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.788425 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-config-data" (OuterVolumeSpecName: "config-data") pod "014742b0-d197-4f2d-9186-3fb1daa4318e" (UID: "014742b0-d197-4f2d-9186-3fb1daa4318e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.868121 4992 scope.go:117] "RemoveContainer" containerID="7b937a4004691f2f165084eb4019197a06c9318f767b4288d38811fb1be2cf24" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.884093 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014742b0-d197-4f2d-9186-3fb1daa4318e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.891797 4992 scope.go:117] "RemoveContainer" containerID="06b73264b209677c52611b4cddf3bccca55e8e0c96743ceca7fdff048ee2011d" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.918323 4992 scope.go:117] "RemoveContainer" containerID="20fa91fbba6cc74e59aebc509d7cbb6437d5c60aadc4238709bd45e2c190fe78" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.979897 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.994958 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 31 09:45:19 crc kubenswrapper[4992]: I0131 09:45:19.996993 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013108 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013633 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api-log" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013663 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api-log" Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013690 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="sg-core" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013699 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="sg-core" Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013718 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-central-agent" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013726 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-central-agent" Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013742 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013750 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api" Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013766 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb03cbe4-dd01-4ffb-a581-8d16b41c2102" containerName="init" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013774 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb03cbe4-dd01-4ffb-a581-8d16b41c2102" containerName="init" Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013788 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-notification-agent" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013797 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-notification-agent" Jan 31 09:45:20 crc kubenswrapper[4992]: E0131 09:45:20.013813 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="proxy-httpd" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.013820 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="proxy-httpd" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014007 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014028 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-central-agent" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014045 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="proxy-httpd" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014052 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb03cbe4-dd01-4ffb-a581-8d16b41c2102" containerName="init" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014063 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="ceilometer-notification-agent" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014069 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" containerName="sg-core" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.014078 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" containerName="cinder-api-log" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.015216 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.022007 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.022202 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.023700 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.027624 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.038459 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.048485 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.057827 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.060107 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.065075 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.065449 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.088878 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.088929 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8wb\" (UniqueName: \"kubernetes.io/projected/d07041cf-fa25-4d72-9ff6-c756c3ced72f-kube-api-access-5f8wb\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089004 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-config-data\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089022 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089044 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-config-data\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089065 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-scripts\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089095 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089113 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d07041cf-fa25-4d72-9ff6-c756c3ced72f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089130 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089152 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjht\" (UniqueName: 
\"kubernetes.io/projected/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-kube-api-access-lsjht\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089170 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089203 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-config-data-custom\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089222 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-run-httpd\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089254 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-scripts\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089281 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-log-httpd\") pod \"ceilometer-0\" (UID: 
\"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.089300 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d07041cf-fa25-4d72-9ff6-c756c3ced72f-logs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.110056 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194163 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194220 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8wb\" (UniqueName: \"kubernetes.io/projected/d07041cf-fa25-4d72-9ff6-c756c3ced72f-kube-api-access-5f8wb\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194307 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-config-data\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194342 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " 
pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194385 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-config-data\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194445 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-scripts\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194501 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194561 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d07041cf-fa25-4d72-9ff6-c756c3ced72f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194585 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194616 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjht\" (UniqueName: 
\"kubernetes.io/projected/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-kube-api-access-lsjht\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194647 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194676 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-config-data-custom\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194708 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-run-httpd\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194752 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-scripts\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.194791 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-log-httpd\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 
09:45:20.194817 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d07041cf-fa25-4d72-9ff6-c756c3ced72f-logs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.195357 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d07041cf-fa25-4d72-9ff6-c756c3ced72f-logs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.196166 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d07041cf-fa25-4d72-9ff6-c756c3ced72f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.202230 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-config-data\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.203158 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-run-httpd\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.203477 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-log-httpd\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 
09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.205361 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6958dd7cf4-b96rn"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.208810 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-config-data\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.209138 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.210964 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.211030 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-scripts\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.211079 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-config-data-custom\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.211354 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-scripts\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.211407 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.216919 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d07041cf-fa25-4d72-9ff6-c756c3ced72f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.217268 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.219100 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.220205 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6958dd7cf4-b96rn"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.220373 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.220652 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.221512 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjht\" (UniqueName: \"kubernetes.io/projected/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-kube-api-access-lsjht\") pod \"ceilometer-0\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.237859 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8wb\" (UniqueName: \"kubernetes.io/projected/d07041cf-fa25-4d72-9ff6-c756c3ced72f-kube-api-access-5f8wb\") pod \"cinder-api-0\" (UID: \"d07041cf-fa25-4d72-9ff6-c756c3ced72f\") " pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.296358 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9nl\" (UniqueName: \"kubernetes.io/projected/d2f2b0bd-990b-43a8-93e0-70564d946308-kube-api-access-8c9nl\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: 
\"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.296436 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-config-data-custom\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.296534 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f2b0bd-990b-43a8-93e0-70564d946308-logs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.296627 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-public-tls-certs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.296818 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-config-data\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.297036 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-internal-tls-certs\") pod 
\"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.297067 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-combined-ca-bundle\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.339507 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.393504 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398461 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-combined-ca-bundle\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398512 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-internal-tls-certs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398566 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9nl\" (UniqueName: \"kubernetes.io/projected/d2f2b0bd-990b-43a8-93e0-70564d946308-kube-api-access-8c9nl\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: 
\"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398589 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-config-data-custom\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398611 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f2b0bd-990b-43a8-93e0-70564d946308-logs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398641 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-public-tls-certs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.398689 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-config-data\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.401821 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f2b0bd-990b-43a8-93e0-70564d946308-logs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 
31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.402982 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-combined-ca-bundle\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.404190 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-config-data\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.404204 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-internal-tls-certs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.405615 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-config-data-custom\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.411194 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f2b0bd-990b-43a8-93e0-70564d946308-public-tls-certs\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.417230 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9nl\" (UniqueName: \"kubernetes.io/projected/d2f2b0bd-990b-43a8-93e0-70564d946308-kube-api-access-8c9nl\") pod \"barbican-api-6958dd7cf4-b96rn\" (UID: \"d2f2b0bd-990b-43a8-93e0-70564d946308\") " pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.605678 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.848498 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 31 09:45:20 crc kubenswrapper[4992]: I0131 09:45:20.961795 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.130557 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6958dd7cf4-b96rn"] Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.197147 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014742b0-d197-4f2d-9186-3fb1daa4318e" path="/var/lib/kubelet/pods/014742b0-d197-4f2d-9186-3fb1daa4318e/volumes" Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.198517 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14b8bc4-65b8-4dce-aa68-0362de63be9d" path="/var/lib/kubelet/pods/d14b8bc4-65b8-4dce-aa68-0362de63be9d/volumes" Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.675147 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6958dd7cf4-b96rn" event={"ID":"d2f2b0bd-990b-43a8-93e0-70564d946308","Type":"ContainerStarted","Data":"a9244ae522f5e9001df534975b2546e86b29c0ffd5e4fe6d8ba89081dc87c4d9"} Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.675525 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:21 crc 
kubenswrapper[4992]: I0131 09:45:21.675542 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.675552 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6958dd7cf4-b96rn" event={"ID":"d2f2b0bd-990b-43a8-93e0-70564d946308","Type":"ContainerStarted","Data":"8fa362cd5929e46d6c9bdbe34728db5733638cded8db31086c1086b5fcbda806"} Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.675564 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6958dd7cf4-b96rn" event={"ID":"d2f2b0bd-990b-43a8-93e0-70564d946308","Type":"ContainerStarted","Data":"eac06d3d445ee22228de5ef3db2199340aba3bd233baf72a949c8d960770054a"} Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.678950 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d07041cf-fa25-4d72-9ff6-c756c3ced72f","Type":"ContainerStarted","Data":"a6e00a63d3065e46809c52f3057d4a0acb7d61937f5f402de4e096ccf75a9e09"} Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.678974 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d07041cf-fa25-4d72-9ff6-c756c3ced72f","Type":"ContainerStarted","Data":"b23d0002ec2c6d5d6f271f1efc199aa944bc0b72c1a7e6bb8dfd5aa36d63ba04"} Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.681205 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerStarted","Data":"c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f"} Jan 31 09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.681239 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerStarted","Data":"b8d6d162f776eb5e13eb352dcace54917976b629ce59636ecfda2db36d3d2fd4"} Jan 31 
09:45:21 crc kubenswrapper[4992]: I0131 09:45:21.709194 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6958dd7cf4-b96rn" podStartSLOduration=1.709169788 podStartE2EDuration="1.709169788s" podCreationTimestamp="2026-01-31 09:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:21.694709031 +0000 UTC m=+1217.666101038" watchObservedRunningTime="2026-01-31 09:45:21.709169788 +0000 UTC m=+1217.680561775" Jan 31 09:45:22 crc kubenswrapper[4992]: I0131 09:45:22.690687 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d07041cf-fa25-4d72-9ff6-c756c3ced72f","Type":"ContainerStarted","Data":"531c38931bad37b0dc2327e77eeb8eea773cf9163057016cdebfb927d15db80b"} Jan 31 09:45:22 crc kubenswrapper[4992]: I0131 09:45:22.691357 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 31 09:45:22 crc kubenswrapper[4992]: I0131 09:45:22.695372 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerStarted","Data":"a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929"} Jan 31 09:45:22 crc kubenswrapper[4992]: I0131 09:45:22.728348 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.728327876 podStartE2EDuration="3.728327876s" podCreationTimestamp="2026-01-31 09:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:22.710190763 +0000 UTC m=+1218.681582760" watchObservedRunningTime="2026-01-31 09:45:22.728327876 +0000 UTC m=+1218.699719873" Jan 31 09:45:23 crc kubenswrapper[4992]: I0131 09:45:23.713536 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerStarted","Data":"daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4"} Jan 31 09:45:24 crc kubenswrapper[4992]: I0131 09:45:24.659119 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:45:24 crc kubenswrapper[4992]: I0131 09:45:24.659658 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:24 crc kubenswrapper[4992]: I0131 09:45:24.851936 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.015319 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-694d5dc9d5-kr7sf"] Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.019179 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-694d5dc9d5-kr7sf" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-httpd" containerID="cri-o://7530863dd25130eca663eeecef851785104f103d648bcad98c3e2140af7b6b11" gracePeriod=30 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.019371 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-694d5dc9d5-kr7sf" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-api" containerID="cri-o://239b180e3c7322ffc9253cb2e4049d21ad52ac110469b2670d6b859cbcadf901" gracePeriod=30 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.029634 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-694d5dc9d5-kr7sf" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": read tcp 10.217.0.2:49272->10.217.0.145:9696: read: connection reset by peer" Jan 31 09:45:25 crc 
kubenswrapper[4992]: I0131 09:45:25.051006 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-98d958769-wtkh9"] Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.052389 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.071290 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98d958769-wtkh9"] Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095379 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-combined-ca-bundle\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095438 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv76r\" (UniqueName: \"kubernetes.io/projected/987e9d9b-684a-4d52-adb0-bf76c86e9999-kube-api-access-rv76r\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095508 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-public-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095544 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-internal-tls-certs\") pod 
\"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095568 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-httpd-config\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095584 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-ovndb-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.095612 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-config\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.197118 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-config\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.197914 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-combined-ca-bundle\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " 
pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.197937 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv76r\" (UniqueName: \"kubernetes.io/projected/987e9d9b-684a-4d52-adb0-bf76c86e9999-kube-api-access-rv76r\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.198006 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-public-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.198044 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-internal-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.198070 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-httpd-config\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.198091 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-ovndb-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 
09:45:25.200100 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.201947 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-config\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.203608 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-ovndb-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.206054 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-public-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.206295 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-httpd-config\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.215884 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-internal-tls-certs\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc 
kubenswrapper[4992]: I0131 09:45:25.227885 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987e9d9b-684a-4d52-adb0-bf76c86e9999-combined-ca-bundle\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.242805 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv76r\" (UniqueName: \"kubernetes.io/projected/987e9d9b-684a-4d52-adb0-bf76c86e9999-kube-api-access-rv76r\") pod \"neutron-98d958769-wtkh9\" (UID: \"987e9d9b-684a-4d52-adb0-bf76c86e9999\") " pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.304188 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-m8sjm"] Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.304716 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerName="dnsmasq-dns" containerID="cri-o://4a9148aae46144f5a3fbd2baca2ad1b7cb5a1c167629337ce4eed1b9d7243b4a" gracePeriod=10 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.396137 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.432065 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.486969 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.764384 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerStarted","Data":"470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630"} Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.766207 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.776188 4992 generic.go:334] "Generic (PLEG): container finished" podID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerID="4a9148aae46144f5a3fbd2baca2ad1b7cb5a1c167629337ce4eed1b9d7243b4a" exitCode=0 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.776406 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" event={"ID":"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3","Type":"ContainerDied","Data":"4a9148aae46144f5a3fbd2baca2ad1b7cb5a1c167629337ce4eed1b9d7243b4a"} Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.802295 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.996929229 podStartE2EDuration="5.802279065s" podCreationTimestamp="2026-01-31 09:45:20 +0000 UTC" firstStartedPulling="2026-01-31 09:45:20.972002325 +0000 UTC m=+1216.943394322" lastFinishedPulling="2026-01-31 09:45:24.777352171 +0000 UTC m=+1220.748744158" observedRunningTime="2026-01-31 09:45:25.799258468 +0000 UTC m=+1221.770650475" watchObservedRunningTime="2026-01-31 09:45:25.802279065 
+0000 UTC m=+1221.773671052" Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.804126 4992 generic.go:334] "Generic (PLEG): container finished" podID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerID="7530863dd25130eca663eeecef851785104f103d648bcad98c3e2140af7b6b11" exitCode=0 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.804320 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="cinder-scheduler" containerID="cri-o://7b3b8313f535d16404f9acdded4ef888d1ac328000359cccc564a1ca4e6a5807" gracePeriod=30 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.804630 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-694d5dc9d5-kr7sf" event={"ID":"3665c963-d0e3-4317-9bd8-50cc6d7bff5a","Type":"ContainerDied","Data":"7530863dd25130eca663eeecef851785104f103d648bcad98c3e2140af7b6b11"} Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.805218 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="probe" containerID="cri-o://50cad3134a8b08211c079d64aa6d01aa9ba5fcc68f7b5bc3a122cb9f4fee20c6" gracePeriod=30 Jan 31 09:45:25 crc kubenswrapper[4992]: I0131 09:45:25.957001 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.014003 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzhkz\" (UniqueName: \"kubernetes.io/projected/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-kube-api-access-zzhkz\") pod \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.014058 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-dns-svc\") pod \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.014136 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-nb\") pod \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.014195 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-config\") pod \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.015004 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-sb\") pod \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\" (UID: \"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3\") " Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.041774 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-kube-api-access-zzhkz" (OuterVolumeSpecName: "kube-api-access-zzhkz") pod "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" (UID: "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3"). InnerVolumeSpecName "kube-api-access-zzhkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.087806 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.117142 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzhkz\" (UniqueName: \"kubernetes.io/projected/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-kube-api-access-zzhkz\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.119324 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" (UID: "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.119729 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-config" (OuterVolumeSpecName: "config") pod "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" (UID: "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.147942 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.151844 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" (UID: "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.159954 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" (UID: "6a93fcde-6251-4bfd-8b00-e8c1b7f233b3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.218672 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.218701 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.218713 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.218723 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.610169 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98d958769-wtkh9"] Jan 31 09:45:26 crc kubenswrapper[4992]: W0131 09:45:26.620304 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod987e9d9b_684a_4d52_adb0_bf76c86e9999.slice/crio-d545bf78ddc5ab3c9357b8515aa776112335a7c8e8993248953b1cc43f31952d WatchSource:0}: Error finding container d545bf78ddc5ab3c9357b8515aa776112335a7c8e8993248953b1cc43f31952d: Status 404 returned error can't find the container with id d545bf78ddc5ab3c9357b8515aa776112335a7c8e8993248953b1cc43f31952d Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.821448 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" 
event={"ID":"6a93fcde-6251-4bfd-8b00-e8c1b7f233b3","Type":"ContainerDied","Data":"d1b0de2dca7fc3735cfe33ab239456f9530332e08ecc5be29492a91a67ada5c0"} Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.822288 4992 scope.go:117] "RemoveContainer" containerID="4a9148aae46144f5a3fbd2baca2ad1b7cb5a1c167629337ce4eed1b9d7243b4a" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.821515 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-m8sjm" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.823619 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98d958769-wtkh9" event={"ID":"987e9d9b-684a-4d52-adb0-bf76c86e9999","Type":"ContainerStarted","Data":"d545bf78ddc5ab3c9357b8515aa776112335a7c8e8993248953b1cc43f31952d"} Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.827997 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-694d5dc9d5-kr7sf" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": dial tcp 10.217.0.145:9696: connect: connection refused" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.895277 4992 scope.go:117] "RemoveContainer" containerID="4f23ef9dc3234c0368c8e17634eba80dc7303ca9533175841a6bc6ec0c72dea9" Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.928854 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-m8sjm"] Jan 31 09:45:26 crc kubenswrapper[4992]: I0131 09:45:26.953277 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-m8sjm"] Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.007601 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.203947 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" path="/var/lib/kubelet/pods/6a93fcde-6251-4bfd-8b00-e8c1b7f233b3/volumes" Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.529065 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d6fc5dc84-n2kln" Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.619652 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65f6bf6f54-x2b8z"] Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.833813 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98d958769-wtkh9" event={"ID":"987e9d9b-684a-4d52-adb0-bf76c86e9999","Type":"ContainerStarted","Data":"11c5b66b86d9212f968a52713cccc2bbbad0f95827f6d75aef8c9de7f8b8f39a"} Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.833856 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98d958769-wtkh9" event={"ID":"987e9d9b-684a-4d52-adb0-bf76c86e9999","Type":"ContainerStarted","Data":"ac982c71f17c3e569c8ad8b8f72f6dc0c49e25cb1946544342d9e9d9367dfc6b"} Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.834147 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.838888 4992 generic.go:334] "Generic (PLEG): container finished" podID="a7852c78-5488-4147-8160-f521bcdb5075" containerID="50cad3134a8b08211c079d64aa6d01aa9ba5fcc68f7b5bc3a122cb9f4fee20c6" exitCode=0 Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.839984 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7852c78-5488-4147-8160-f521bcdb5075","Type":"ContainerDied","Data":"50cad3134a8b08211c079d64aa6d01aa9ba5fcc68f7b5bc3a122cb9f4fee20c6"} Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.840112 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-65f6bf6f54-x2b8z" 
podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon-log" containerID="cri-o://235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc" gracePeriod=30 Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.840366 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-65f6bf6f54-x2b8z" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" containerID="cri-o://ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d" gracePeriod=30 Jan 31 09:45:27 crc kubenswrapper[4992]: I0131 09:45:27.863946 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-98d958769-wtkh9" podStartSLOduration=2.863923466 podStartE2EDuration="2.863923466s" podCreationTimestamp="2026-01-31 09:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:27.862288418 +0000 UTC m=+1223.833680425" watchObservedRunningTime="2026-01-31 09:45:27.863923466 +0000 UTC m=+1223.835315453" Jan 31 09:45:29 crc kubenswrapper[4992]: I0131 09:45:29.872980 4992 generic.go:334] "Generic (PLEG): container finished" podID="a7852c78-5488-4147-8160-f521bcdb5075" containerID="7b3b8313f535d16404f9acdded4ef888d1ac328000359cccc564a1ca4e6a5807" exitCode=0 Jan 31 09:45:29 crc kubenswrapper[4992]: I0131 09:45:29.874029 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7852c78-5488-4147-8160-f521bcdb5075","Type":"ContainerDied","Data":"7b3b8313f535d16404f9acdded4ef888d1ac328000359cccc564a1ca4e6a5807"} Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.014896 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.207986 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data-custom\") pod \"a7852c78-5488-4147-8160-f521bcdb5075\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.208044 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data\") pod \"a7852c78-5488-4147-8160-f521bcdb5075\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.208209 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75fzw\" (UniqueName: \"kubernetes.io/projected/a7852c78-5488-4147-8160-f521bcdb5075-kube-api-access-75fzw\") pod \"a7852c78-5488-4147-8160-f521bcdb5075\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.208286 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-combined-ca-bundle\") pod \"a7852c78-5488-4147-8160-f521bcdb5075\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.208348 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7852c78-5488-4147-8160-f521bcdb5075-etc-machine-id\") pod \"a7852c78-5488-4147-8160-f521bcdb5075\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.208375 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-scripts\") pod \"a7852c78-5488-4147-8160-f521bcdb5075\" (UID: \"a7852c78-5488-4147-8160-f521bcdb5075\") " Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.208992 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7852c78-5488-4147-8160-f521bcdb5075-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a7852c78-5488-4147-8160-f521bcdb5075" (UID: "a7852c78-5488-4147-8160-f521bcdb5075"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.220678 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7852c78-5488-4147-8160-f521bcdb5075" (UID: "a7852c78-5488-4147-8160-f521bcdb5075"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.220863 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7852c78-5488-4147-8160-f521bcdb5075-kube-api-access-75fzw" (OuterVolumeSpecName: "kube-api-access-75fzw") pod "a7852c78-5488-4147-8160-f521bcdb5075" (UID: "a7852c78-5488-4147-8160-f521bcdb5075"). InnerVolumeSpecName "kube-api-access-75fzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.220928 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-scripts" (OuterVolumeSpecName: "scripts") pod "a7852c78-5488-4147-8160-f521bcdb5075" (UID: "a7852c78-5488-4147-8160-f521bcdb5075"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.293910 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7852c78-5488-4147-8160-f521bcdb5075" (UID: "a7852c78-5488-4147-8160-f521bcdb5075"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.311150 4992 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.311189 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75fzw\" (UniqueName: \"kubernetes.io/projected/a7852c78-5488-4147-8160-f521bcdb5075-kube-api-access-75fzw\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.311203 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.311214 4992 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7852c78-5488-4147-8160-f521bcdb5075-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.311225 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.367574 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data" (OuterVolumeSpecName: "config-data") pod "a7852c78-5488-4147-8160-f521bcdb5075" (UID: "a7852c78-5488-4147-8160-f521bcdb5075"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.413951 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7852c78-5488-4147-8160-f521bcdb5075-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.884819 4992 generic.go:334] "Generic (PLEG): container finished" podID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerID="239b180e3c7322ffc9253cb2e4049d21ad52ac110469b2670d6b859cbcadf901" exitCode=0 Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.884878 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-694d5dc9d5-kr7sf" event={"ID":"3665c963-d0e3-4317-9bd8-50cc6d7bff5a","Type":"ContainerDied","Data":"239b180e3c7322ffc9253cb2e4049d21ad52ac110469b2670d6b859cbcadf901"} Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.888847 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a7852c78-5488-4147-8160-f521bcdb5075","Type":"ContainerDied","Data":"5b3e0963f9591945cf42c788604bbeb2d5a2d47c31554d71a4c8d7b9c907912b"} Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.888902 4992 scope.go:117] "RemoveContainer" containerID="50cad3134a8b08211c079d64aa6d01aa9ba5fcc68f7b5bc3a122cb9f4fee20c6" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.888962 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.930585 4992 scope.go:117] "RemoveContainer" containerID="7b3b8313f535d16404f9acdded4ef888d1ac328000359cccc564a1ca4e6a5807" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.931480 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.957116 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.978721 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:30 crc kubenswrapper[4992]: E0131 09:45:30.979229 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerName="init" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979246 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerName="init" Jan 31 09:45:30 crc kubenswrapper[4992]: E0131 09:45:30.979263 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerName="dnsmasq-dns" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979275 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerName="dnsmasq-dns" Jan 31 09:45:30 crc kubenswrapper[4992]: E0131 09:45:30.979309 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="probe" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979318 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="probe" Jan 31 09:45:30 crc kubenswrapper[4992]: E0131 09:45:30.979345 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="cinder-scheduler" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979354 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="cinder-scheduler" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979757 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a93fcde-6251-4bfd-8b00-e8c1b7f233b3" containerName="dnsmasq-dns" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979821 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="cinder-scheduler" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.979854 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7852c78-5488-4147-8160-f521bcdb5075" containerName="probe" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.981211 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:45:30 crc kubenswrapper[4992]: I0131 09:45:30.987997 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.008609 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.125119 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2m2\" (UniqueName: \"kubernetes.io/projected/233c7d9d-d8e2-456d-8382-bbc880debb01-kube-api-access-fz2m2\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.125474 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-scripts\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.125498 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.125541 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233c7d9d-d8e2-456d-8382-bbc880debb01-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.125559 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-config-data\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.125589 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.198659 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7852c78-5488-4147-8160-f521bcdb5075" path="/var/lib/kubelet/pods/a7852c78-5488-4147-8160-f521bcdb5075/volumes" Jan 31 
09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227578 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2m2\" (UniqueName: \"kubernetes.io/projected/233c7d9d-d8e2-456d-8382-bbc880debb01-kube-api-access-fz2m2\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227626 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-scripts\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227650 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227687 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233c7d9d-d8e2-456d-8382-bbc880debb01-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227718 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-config-data\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227746 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.227820 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/233c7d9d-d8e2-456d-8382-bbc880debb01-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.237094 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-config-data\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.237894 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.237900 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-scripts\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.240936 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/233c7d9d-d8e2-456d-8382-bbc880debb01-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 
31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.247105 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2m2\" (UniqueName: \"kubernetes.io/projected/233c7d9d-d8e2-456d-8382-bbc880debb01-kube-api-access-fz2m2\") pod \"cinder-scheduler-0\" (UID: \"233c7d9d-d8e2-456d-8382-bbc880debb01\") " pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.319768 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.323697 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.430799 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-internal-tls-certs\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.430897 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-httpd-config\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.430932 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-public-tls-certs\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.430960 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-ovndb-tls-certs\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.430993 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-config\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.431109 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq97v\" (UniqueName: \"kubernetes.io/projected/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-kube-api-access-qq97v\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.431160 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-combined-ca-bundle\") pod \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\" (UID: \"3665c963-d0e3-4317-9bd8-50cc6d7bff5a\") " Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.456178 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-kube-api-access-qq97v" (OuterVolumeSpecName: "kube-api-access-qq97v") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "kube-api-access-qq97v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.473510 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.532634 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq97v\" (UniqueName: \"kubernetes.io/projected/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-kube-api-access-qq97v\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.532657 4992 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.594778 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.596749 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-config" (OuterVolumeSpecName: "config") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.610015 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.627133 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.628034 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3665c963-d0e3-4317-9bd8-50cc6d7bff5a" (UID: "3665c963-d0e3-4317-9bd8-50cc6d7bff5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.634610 4992 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.634640 4992 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.634649 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.634660 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.634688 4992 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3665c963-d0e3-4317-9bd8-50cc6d7bff5a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.838281 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.904221 4992 generic.go:334] "Generic (PLEG): container finished" podID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerID="ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d" exitCode=0 Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.904308 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65f6bf6f54-x2b8z" 
event={"ID":"04ff2a8b-a743-475e-9ae5-5fb98839ba57","Type":"ContainerDied","Data":"ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d"} Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.909866 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"233c7d9d-d8e2-456d-8382-bbc880debb01","Type":"ContainerStarted","Data":"4d8f8af6d9004519769a82d579b9dfadf89108f6e950e27652ab69d4146b49c9"} Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.955187 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-694d5dc9d5-kr7sf" event={"ID":"3665c963-d0e3-4317-9bd8-50cc6d7bff5a","Type":"ContainerDied","Data":"59db46195bce2b5d41af5cf68bff7eda52196ee04400f335a9a7f409d7bbf23f"} Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.955471 4992 scope.go:117] "RemoveContainer" containerID="7530863dd25130eca663eeecef851785104f103d648bcad98c3e2140af7b6b11" Jan 31 09:45:31 crc kubenswrapper[4992]: I0131 09:45:31.955246 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-694d5dc9d5-kr7sf" Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.032260 4992 scope.go:117] "RemoveContainer" containerID="239b180e3c7322ffc9253cb2e4049d21ad52ac110469b2670d6b859cbcadf901" Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.032500 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-694d5dc9d5-kr7sf"] Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.039289 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-694d5dc9d5-kr7sf"] Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.437561 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.467313 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6958dd7cf4-b96rn" Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.547165 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76ff5b49fd-jljf2"] Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.547375 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76ff5b49fd-jljf2" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api-log" containerID="cri-o://2a8594eb86aee7b4010ece44360e831016a772f6c8b4c3784e6e296719b93825" gracePeriod=30 Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.547506 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76ff5b49fd-jljf2" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api" containerID="cri-o://d5505953e505e2daa4904c4a87c2192e7dada42b708ee67c12c721cb3675d953" gracePeriod=30 Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.830885 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-65f6bf6f54-x2b8z" 
podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.140:8443: connect: connection refused" Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.923024 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 31 09:45:32 crc kubenswrapper[4992]: I0131 09:45:32.986147 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"233c7d9d-d8e2-456d-8382-bbc880debb01","Type":"ContainerStarted","Data":"25a186a996e5b6c649b22ffc3b5bce012163664b0ef70a0b81871198febd334e"} Jan 31 09:45:33 crc kubenswrapper[4992]: I0131 09:45:33.033597 4992 generic.go:334] "Generic (PLEG): container finished" podID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerID="2a8594eb86aee7b4010ece44360e831016a772f6c8b4c3784e6e296719b93825" exitCode=143 Jan 31 09:45:33 crc kubenswrapper[4992]: I0131 09:45:33.034365 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76ff5b49fd-jljf2" event={"ID":"57a39bd8-edb8-4744-bb08-2fc44760608a","Type":"ContainerDied","Data":"2a8594eb86aee7b4010ece44360e831016a772f6c8b4c3784e6e296719b93825"} Jan 31 09:45:33 crc kubenswrapper[4992]: I0131 09:45:33.197804 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" path="/var/lib/kubelet/pods/3665c963-d0e3-4317-9bd8-50cc6d7bff5a/volumes" Jan 31 09:45:34 crc kubenswrapper[4992]: I0131 09:45:34.093331 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"233c7d9d-d8e2-456d-8382-bbc880debb01","Type":"ContainerStarted","Data":"d2bbc1875f675082bbf30a8206b924bf1ce5ce7ccca9cdbe7b23b848ac55733f"} Jan 31 09:45:34 crc kubenswrapper[4992]: I0131 09:45:34.122964 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=4.122942019 podStartE2EDuration="4.122942019s" podCreationTimestamp="2026-01-31 09:45:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:34.114468935 +0000 UTC m=+1230.085860942" watchObservedRunningTime="2026-01-31 09:45:34.122942019 +0000 UTC m=+1230.094334006" Jan 31 09:45:35 crc kubenswrapper[4992]: I0131 09:45:35.717126 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76ff5b49fd-jljf2" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:37644->10.217.0.153:9311: read: connection reset by peer" Jan 31 09:45:35 crc kubenswrapper[4992]: I0131 09:45:35.717134 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76ff5b49fd-jljf2" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:37642->10.217.0.153:9311: read: connection reset by peer" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.124137 4992 generic.go:334] "Generic (PLEG): container finished" podID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerID="d5505953e505e2daa4904c4a87c2192e7dada42b708ee67c12c721cb3675d953" exitCode=0 Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.124181 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76ff5b49fd-jljf2" event={"ID":"57a39bd8-edb8-4744-bb08-2fc44760608a","Type":"ContainerDied","Data":"d5505953e505e2daa4904c4a87c2192e7dada42b708ee67c12c721cb3675d953"} Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.124420 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76ff5b49fd-jljf2" 
event={"ID":"57a39bd8-edb8-4744-bb08-2fc44760608a","Type":"ContainerDied","Data":"f685a2e72e97238d382e62230f5c20046511adfbdbe31f29700e1d2d03108501"} Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.124447 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f685a2e72e97238d382e62230f5c20046511adfbdbe31f29700e1d2d03108501" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.171036 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.251216 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89xxq\" (UniqueName: \"kubernetes.io/projected/57a39bd8-edb8-4744-bb08-2fc44760608a-kube-api-access-89xxq\") pod \"57a39bd8-edb8-4744-bb08-2fc44760608a\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.251448 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data\") pod \"57a39bd8-edb8-4744-bb08-2fc44760608a\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.251488 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-combined-ca-bundle\") pod \"57a39bd8-edb8-4744-bb08-2fc44760608a\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.251525 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data-custom\") pod \"57a39bd8-edb8-4744-bb08-2fc44760608a\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " Jan 31 
09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.252037 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a39bd8-edb8-4744-bb08-2fc44760608a-logs\") pod \"57a39bd8-edb8-4744-bb08-2fc44760608a\" (UID: \"57a39bd8-edb8-4744-bb08-2fc44760608a\") " Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.253174 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a39bd8-edb8-4744-bb08-2fc44760608a-logs" (OuterVolumeSpecName: "logs") pod "57a39bd8-edb8-4744-bb08-2fc44760608a" (UID: "57a39bd8-edb8-4744-bb08-2fc44760608a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.275597 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57a39bd8-edb8-4744-bb08-2fc44760608a" (UID: "57a39bd8-edb8-4744-bb08-2fc44760608a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.276904 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a39bd8-edb8-4744-bb08-2fc44760608a-kube-api-access-89xxq" (OuterVolumeSpecName: "kube-api-access-89xxq") pod "57a39bd8-edb8-4744-bb08-2fc44760608a" (UID: "57a39bd8-edb8-4744-bb08-2fc44760608a"). InnerVolumeSpecName "kube-api-access-89xxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.296519 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57a39bd8-edb8-4744-bb08-2fc44760608a" (UID: "57a39bd8-edb8-4744-bb08-2fc44760608a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.324892 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.347830 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data" (OuterVolumeSpecName: "config-data") pod "57a39bd8-edb8-4744-bb08-2fc44760608a" (UID: "57a39bd8-edb8-4744-bb08-2fc44760608a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.354693 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.354721 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.354733 4992 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57a39bd8-edb8-4744-bb08-2fc44760608a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.354742 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57a39bd8-edb8-4744-bb08-2fc44760608a-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.354750 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89xxq\" (UniqueName: \"kubernetes.io/projected/57a39bd8-edb8-4744-bb08-2fc44760608a-kube-api-access-89xxq\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.768073 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:36 crc kubenswrapper[4992]: I0131 09:45:36.770585 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.094552 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.099253 4992 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7796988564-hnmhv" Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.131731 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76ff5b49fd-jljf2" Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.175865 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76ff5b49fd-jljf2"] Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.192067 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-76ff5b49fd-jljf2"] Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.206181 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-69d4477cc6-bk8rk"] Jan 31 09:45:37 crc kubenswrapper[4992]: I0131 09:45:37.276067 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55c8cdc56b-dkph6" Jan 31 09:45:38 crc kubenswrapper[4992]: I0131 09:45:38.140096 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-69d4477cc6-bk8rk" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-log" containerID="cri-o://9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd" gracePeriod=30 Jan 31 09:45:38 crc kubenswrapper[4992]: I0131 09:45:38.140153 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-69d4477cc6-bk8rk" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-api" containerID="cri-o://0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e" gracePeriod=30 Jan 31 09:45:39 crc kubenswrapper[4992]: I0131 09:45:39.150087 4992 generic.go:334] "Generic (PLEG): container finished" podID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerID="9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd" exitCode=143 Jan 31 09:45:39 crc kubenswrapper[4992]: I0131 
09:45:39.150125 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69d4477cc6-bk8rk" event={"ID":"1fdcaec9-5cd0-4117-bb84-413a80a5860c","Type":"ContainerDied","Data":"9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd"} Jan 31 09:45:39 crc kubenswrapper[4992]: I0131 09:45:39.194472 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" path="/var/lib/kubelet/pods/57a39bd8-edb8-4744-bb08-2fc44760608a/volumes" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.568243 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.691024 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764351 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-config-data\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764467 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-scripts\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764567 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdcaec9-5cd0-4117-bb84-413a80a5860c-logs\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764648 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-combined-ca-bundle\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764683 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxn6w\" (UniqueName: \"kubernetes.io/projected/1fdcaec9-5cd0-4117-bb84-413a80a5860c-kube-api-access-lxn6w\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764709 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-public-tls-certs\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.764729 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-internal-tls-certs\") pod \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\" (UID: \"1fdcaec9-5cd0-4117-bb84-413a80a5860c\") " Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.766623 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fdcaec9-5cd0-4117-bb84-413a80a5860c-logs" (OuterVolumeSpecName: "logs") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.779295 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-scripts" (OuterVolumeSpecName: "scripts") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.779329 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fdcaec9-5cd0-4117-bb84-413a80a5860c-kube-api-access-lxn6w" (OuterVolumeSpecName: "kube-api-access-lxn6w") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "kube-api-access-lxn6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.852944 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 31 09:45:41 crc kubenswrapper[4992]: E0131 09:45:41.854563 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854586 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-api" Jan 31 09:45:41 crc kubenswrapper[4992]: E0131 09:45:41.854606 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-log" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854614 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-log" Jan 31 09:45:41 crc kubenswrapper[4992]: E0131 09:45:41.854626 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854634 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-api" Jan 31 09:45:41 crc kubenswrapper[4992]: E0131 09:45:41.854651 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api-log" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854657 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api-log" Jan 31 09:45:41 crc kubenswrapper[4992]: E0131 09:45:41.854667 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-httpd" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854673 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-httpd" Jan 31 09:45:41 crc kubenswrapper[4992]: E0131 09:45:41.854698 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854704 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854915 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854941 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="3665c963-d0e3-4317-9bd8-50cc6d7bff5a" containerName="neutron-httpd" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854952 4992 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api-log" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854963 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a39bd8-edb8-4744-bb08-2fc44760608a" containerName="barbican-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854982 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-log" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.854991 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerName="placement-api" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.856000 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.861473 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.862216 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.862367 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-config-data" (OuterVolumeSpecName: "config-data") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.862484 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7mwjr" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.862578 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.868204 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdcaec9-5cd0-4117-bb84-413a80a5860c-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.868238 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxn6w\" (UniqueName: \"kubernetes.io/projected/1fdcaec9-5cd0-4117-bb84-413a80a5860c-kube-api-access-lxn6w\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.868250 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.868261 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.887673 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.923780 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.938557 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fdcaec9-5cd0-4117-bb84-413a80a5860c" (UID: "1fdcaec9-5cd0-4117-bb84-413a80a5860c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.969852 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.969984 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-openstack-config-secret\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.970008 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-openstack-config\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.970064 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqmzq\" (UniqueName: \"kubernetes.io/projected/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-kube-api-access-hqmzq\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.970170 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.970184 4992 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:41 crc kubenswrapper[4992]: I0131 09:45:41.970194 4992 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdcaec9-5cd0-4117-bb84-413a80a5860c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.072576 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-openstack-config-secret\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.072656 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-openstack-config\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.072828 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqmzq\" (UniqueName: \"kubernetes.io/projected/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-kube-api-access-hqmzq\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.073451 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-openstack-config\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.073508 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.076619 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-openstack-config-secret\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.079407 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " 
pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.089963 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqmzq\" (UniqueName: \"kubernetes.io/projected/34f57f1e-1a2d-40dd-8e48-20a1454f1eca-kube-api-access-hqmzq\") pod \"openstackclient\" (UID: \"34f57f1e-1a2d-40dd-8e48-20a1454f1eca\") " pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.182789 4992 generic.go:334] "Generic (PLEG): container finished" podID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" containerID="0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e" exitCode=0 Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.182835 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69d4477cc6-bk8rk" event={"ID":"1fdcaec9-5cd0-4117-bb84-413a80a5860c","Type":"ContainerDied","Data":"0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e"} Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.182858 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69d4477cc6-bk8rk" event={"ID":"1fdcaec9-5cd0-4117-bb84-413a80a5860c","Type":"ContainerDied","Data":"800587f4aced5b2d899059735c504ca76bdab3abd022f9da1e4b366871f58417"} Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.182878 4992 scope.go:117] "RemoveContainer" containerID="0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.182907 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69d4477cc6-bk8rk" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.206161 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.225280 4992 scope.go:117] "RemoveContainer" containerID="9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.231714 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-69d4477cc6-bk8rk"] Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.240960 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-69d4477cc6-bk8rk"] Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.262125 4992 scope.go:117] "RemoveContainer" containerID="0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e" Jan 31 09:45:42 crc kubenswrapper[4992]: E0131 09:45:42.263922 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e\": container with ID starting with 0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e not found: ID does not exist" containerID="0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.263964 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e"} err="failed to get container status \"0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e\": rpc error: code = NotFound desc = could not find container \"0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e\": container with ID starting with 0e1384fc19becb01c538d3719e98871bbbcae8ff4c1bb20f7ea2744ce384560e not found: ID does not exist" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.263996 4992 scope.go:117] "RemoveContainer" containerID="9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd" Jan 31 09:45:42 crc 
kubenswrapper[4992]: E0131 09:45:42.264532 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd\": container with ID starting with 9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd not found: ID does not exist" containerID="9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.264585 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd"} err="failed to get container status \"9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd\": rpc error: code = NotFound desc = could not find container \"9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd\": container with ID starting with 9bd66698e5a87bacc917d89d478ec05bf9373a06105c037624c62e4b14fe53bd not found: ID does not exist" Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.711173 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 31 09:45:42 crc kubenswrapper[4992]: I0131 09:45:42.829706 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-65f6bf6f54-x2b8z" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.140:8443: connect: connection refused" Jan 31 09:45:43 crc kubenswrapper[4992]: I0131 09:45:43.193921 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fdcaec9-5cd0-4117-bb84-413a80a5860c" path="/var/lib/kubelet/pods/1fdcaec9-5cd0-4117-bb84-413a80a5860c/volumes" Jan 31 09:45:43 crc kubenswrapper[4992]: I0131 09:45:43.195162 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"34f57f1e-1a2d-40dd-8e48-20a1454f1eca","Type":"ContainerStarted","Data":"e1a3594a2c5b48e111d33ccffaa83fde50e9ed691d5a20a6c96835dfa074940f"} Jan 31 09:45:45 crc kubenswrapper[4992]: I0131 09:45:45.301607 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:45:45 crc kubenswrapper[4992]: I0131 09:45:45.302021 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:45:45 crc kubenswrapper[4992]: I0131 09:45:45.302072 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:45:45 crc kubenswrapper[4992]: I0131 09:45:45.302898 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eefc220641844057c58f4645845ce2f51a73e101cb77d772da4c569d245be5c5"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:45:45 crc kubenswrapper[4992]: I0131 09:45:45.302952 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://eefc220641844057c58f4645845ce2f51a73e101cb77d772da4c569d245be5c5" gracePeriod=600 Jan 31 09:45:46 crc kubenswrapper[4992]: I0131 09:45:46.287151 
4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="eefc220641844057c58f4645845ce2f51a73e101cb77d772da4c569d245be5c5" exitCode=0 Jan 31 09:45:46 crc kubenswrapper[4992]: I0131 09:45:46.287308 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"eefc220641844057c58f4645845ce2f51a73e101cb77d772da4c569d245be5c5"} Jan 31 09:45:46 crc kubenswrapper[4992]: I0131 09:45:46.287556 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"85b7e8954b104f8b7761c24a9e9d822599579a66efc36412ab6a9d3f1890fe38"} Jan 31 09:45:46 crc kubenswrapper[4992]: I0131 09:45:46.287586 4992 scope.go:117] "RemoveContainer" containerID="56fd2e562c473f9f02a32edbe3694b09ca6daec109306548ace480ef8bb463a3" Jan 31 09:45:50 crc kubenswrapper[4992]: I0131 09:45:50.397041 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 09:45:51 crc kubenswrapper[4992]: I0131 09:45:51.456548 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:51 crc kubenswrapper[4992]: I0131 09:45:51.457107 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-central-agent" containerID="cri-o://c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f" gracePeriod=30 Jan 31 09:45:51 crc kubenswrapper[4992]: I0131 09:45:51.457180 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="sg-core" 
containerID="cri-o://daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4" gracePeriod=30 Jan 31 09:45:51 crc kubenswrapper[4992]: I0131 09:45:51.457185 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-notification-agent" containerID="cri-o://a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929" gracePeriod=30 Jan 31 09:45:51 crc kubenswrapper[4992]: I0131 09:45:51.457178 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="proxy-httpd" containerID="cri-o://470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630" gracePeriod=30 Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.353461 4992 generic.go:334] "Generic (PLEG): container finished" podID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerID="470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630" exitCode=0 Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.353893 4992 generic.go:334] "Generic (PLEG): container finished" podID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerID="daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4" exitCode=2 Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.353905 4992 generic.go:334] "Generic (PLEG): container finished" podID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerID="c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f" exitCode=0 Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.353835 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerDied","Data":"470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630"} Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.353994 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerDied","Data":"daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4"} Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.354018 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerDied","Data":"c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f"} Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.831311 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-65f6bf6f54-x2b8z" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.140:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.140:8443: connect: connection refused" Jan 31 09:45:52 crc kubenswrapper[4992]: I0131 09:45:52.831528 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.363922 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"34f57f1e-1a2d-40dd-8e48-20a1454f1eca","Type":"ContainerStarted","Data":"cc4004a5d72ac8e641ad3b7b69e35625c3cad6d84ead7378d5fea0a880b48ad3"} Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.383929 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.232264369 podStartE2EDuration="12.38390308s" podCreationTimestamp="2026-01-31 09:45:41 +0000 UTC" firstStartedPulling="2026-01-31 09:45:42.719175012 +0000 UTC m=+1238.690566999" lastFinishedPulling="2026-01-31 09:45:52.870813723 +0000 UTC m=+1248.842205710" observedRunningTime="2026-01-31 09:45:53.382708187 +0000 UTC m=+1249.354100174" watchObservedRunningTime="2026-01-31 09:45:53.38390308 +0000 UTC m=+1249.355295067" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 
09:45:53.614777 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2ngx9"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.616063 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.647627 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2ngx9"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.711018 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-operator-scripts\") pod \"nova-api-db-create-2ngx9\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.711071 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhwr8\" (UniqueName: \"kubernetes.io/projected/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-kube-api-access-xhwr8\") pod \"nova-api-db-create-2ngx9\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.719576 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vlvb9"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.720853 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.732692 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3ffc-account-create-update-jxhx5"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.733942 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.738782 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.757667 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vlvb9"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.778615 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3ffc-account-create-update-jxhx5"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.814566 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2658f892-db13-4d57-96eb-dcf80264e0f7-operator-scripts\") pod \"nova-api-3ffc-account-create-update-jxhx5\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.814620 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhwr8\" (UniqueName: \"kubernetes.io/projected/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-kube-api-access-xhwr8\") pod \"nova-api-db-create-2ngx9\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.814642 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-operator-scripts\") pod \"nova-api-db-create-2ngx9\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.814694 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a6187e6e-51fe-4cb2-a042-afce69a45d6b-operator-scripts\") pod \"nova-cell0-db-create-vlvb9\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.814751 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9gb\" (UniqueName: \"kubernetes.io/projected/2658f892-db13-4d57-96eb-dcf80264e0f7-kube-api-access-qq9gb\") pod \"nova-api-3ffc-account-create-update-jxhx5\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.814793 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwp6f\" (UniqueName: \"kubernetes.io/projected/a6187e6e-51fe-4cb2-a042-afce69a45d6b-kube-api-access-qwp6f\") pod \"nova-cell0-db-create-vlvb9\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.815596 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-operator-scripts\") pod \"nova-api-db-create-2ngx9\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.830871 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhwr8\" (UniqueName: \"kubernetes.io/projected/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-kube-api-access-xhwr8\") pod \"nova-api-db-create-2ngx9\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.916319 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6187e6e-51fe-4cb2-a042-afce69a45d6b-operator-scripts\") pod \"nova-cell0-db-create-vlvb9\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.917075 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6187e6e-51fe-4cb2-a042-afce69a45d6b-operator-scripts\") pod \"nova-cell0-db-create-vlvb9\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.917855 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9gb\" (UniqueName: \"kubernetes.io/projected/2658f892-db13-4d57-96eb-dcf80264e0f7-kube-api-access-qq9gb\") pod \"nova-api-3ffc-account-create-update-jxhx5\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.918002 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwp6f\" (UniqueName: \"kubernetes.io/projected/a6187e6e-51fe-4cb2-a042-afce69a45d6b-kube-api-access-qwp6f\") pod \"nova-cell0-db-create-vlvb9\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.918277 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2658f892-db13-4d57-96eb-dcf80264e0f7-operator-scripts\") pod \"nova-api-3ffc-account-create-update-jxhx5\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.918802 4992 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2658f892-db13-4d57-96eb-dcf80264e0f7-operator-scripts\") pod \"nova-api-3ffc-account-create-update-jxhx5\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.921062 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zxdvt"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.922148 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.936211 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.950107 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9gb\" (UniqueName: \"kubernetes.io/projected/2658f892-db13-4d57-96eb-dcf80264e0f7-kube-api-access-qq9gb\") pod \"nova-api-3ffc-account-create-update-jxhx5\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.951577 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-200a-account-create-update-5x5ln"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.952630 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.954969 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.961958 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zxdvt"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.969034 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-200a-account-create-update-5x5ln"] Jan 31 09:45:53 crc kubenswrapper[4992]: I0131 09:45:53.973057 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwp6f\" (UniqueName: \"kubernetes.io/projected/a6187e6e-51fe-4cb2-a042-afce69a45d6b-kube-api-access-qwp6f\") pod \"nova-cell0-db-create-vlvb9\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.020202 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2z6\" (UniqueName: \"kubernetes.io/projected/95c2c6d9-ec19-4eae-aece-08798fa4fc95-kube-api-access-cb2z6\") pod \"nova-cell1-db-create-zxdvt\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.020269 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-operator-scripts\") pod \"nova-cell0-200a-account-create-update-5x5ln\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.020303 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-rmrdt\" (UniqueName: \"kubernetes.io/projected/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-kube-api-access-rmrdt\") pod \"nova-cell0-200a-account-create-update-5x5ln\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.020327 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c6d9-ec19-4eae-aece-08798fa4fc95-operator-scripts\") pod \"nova-cell1-db-create-zxdvt\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.035512 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.050248 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bd9b-account-create-update-6msfr"] Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.051252 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.051562 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.053284 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.084519 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bd9b-account-create-update-6msfr"] Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.122697 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-operator-scripts\") pod \"nova-cell0-200a-account-create-update-5x5ln\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.122757 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmrdt\" (UniqueName: \"kubernetes.io/projected/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-kube-api-access-rmrdt\") pod \"nova-cell0-200a-account-create-update-5x5ln\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.122782 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c6d9-ec19-4eae-aece-08798fa4fc95-operator-scripts\") pod \"nova-cell1-db-create-zxdvt\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.122809 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24bf444-c444-4559-af1a-5bd38f2ce48d-operator-scripts\") pod 
\"nova-cell1-bd9b-account-create-update-6msfr\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.122830 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2bf4\" (UniqueName: \"kubernetes.io/projected/e24bf444-c444-4559-af1a-5bd38f2ce48d-kube-api-access-s2bf4\") pod \"nova-cell1-bd9b-account-create-update-6msfr\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.122995 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2z6\" (UniqueName: \"kubernetes.io/projected/95c2c6d9-ec19-4eae-aece-08798fa4fc95-kube-api-access-cb2z6\") pod \"nova-cell1-db-create-zxdvt\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.128985 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-operator-scripts\") pod \"nova-cell0-200a-account-create-update-5x5ln\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.129889 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c6d9-ec19-4eae-aece-08798fa4fc95-operator-scripts\") pod \"nova-cell1-db-create-zxdvt\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.153344 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2z6\" (UniqueName: 
\"kubernetes.io/projected/95c2c6d9-ec19-4eae-aece-08798fa4fc95-kube-api-access-cb2z6\") pod \"nova-cell1-db-create-zxdvt\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.163002 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmrdt\" (UniqueName: \"kubernetes.io/projected/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-kube-api-access-rmrdt\") pod \"nova-cell0-200a-account-create-update-5x5ln\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.226390 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24bf444-c444-4559-af1a-5bd38f2ce48d-operator-scripts\") pod \"nova-cell1-bd9b-account-create-update-6msfr\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.226453 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2bf4\" (UniqueName: \"kubernetes.io/projected/e24bf444-c444-4559-af1a-5bd38f2ce48d-kube-api-access-s2bf4\") pod \"nova-cell1-bd9b-account-create-update-6msfr\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.227290 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24bf444-c444-4559-af1a-5bd38f2ce48d-operator-scripts\") pod \"nova-cell1-bd9b-account-create-update-6msfr\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.247920 4992 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.248713 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2bf4\" (UniqueName: \"kubernetes.io/projected/e24bf444-c444-4559-af1a-5bd38f2ce48d-kube-api-access-s2bf4\") pod \"nova-cell1-bd9b-account-create-update-6msfr\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.453764 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.462297 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.492359 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2ngx9"] Jan 31 09:45:54 crc kubenswrapper[4992]: W0131 09:45:54.504151 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1cf4c0_06a8_470c_bdb8_7ab0989a94f0.slice/crio-78bfb369aeca821d5894aa6a143e7d56dd03705b1338292a92b9ec23d295ab71 WatchSource:0}: Error finding container 78bfb369aeca821d5894aa6a143e7d56dd03705b1338292a92b9ec23d295ab71: Status 404 returned error can't find the container with id 78bfb369aeca821d5894aa6a143e7d56dd03705b1338292a92b9ec23d295ab71 Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.678365 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vlvb9"] Jan 31 09:45:54 crc kubenswrapper[4992]: W0131 09:45:54.738913 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2658f892_db13_4d57_96eb_dcf80264e0f7.slice/crio-5b1030bcc2955330d46084af24166f142a7ecce595cbce36347545192965d01c WatchSource:0}: Error finding container 5b1030bcc2955330d46084af24166f142a7ecce595cbce36347545192965d01c: Status 404 returned error can't find the container with id 5b1030bcc2955330d46084af24166f142a7ecce595cbce36347545192965d01c Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.743344 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3ffc-account-create-update-jxhx5"] Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.855342 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zxdvt"] Jan 31 09:45:54 crc kubenswrapper[4992]: W0131 09:45:54.888428 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c2c6d9_ec19_4eae_aece_08798fa4fc95.slice/crio-d4db781b5a33b98671b53007e4700ee9f66fab97fdb09783514acc3d0c52b635 WatchSource:0}: Error finding container d4db781b5a33b98671b53007e4700ee9f66fab97fdb09783514acc3d0c52b635: Status 404 returned error can't find the container with id d4db781b5a33b98671b53007e4700ee9f66fab97fdb09783514acc3d0c52b635 Jan 31 09:45:54 crc kubenswrapper[4992]: I0131 09:45:54.945595 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bd9b-account-create-update-6msfr"] Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.069556 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-200a-account-create-update-5x5ln"] Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.156768 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249558 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-run-httpd\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249702 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-combined-ca-bundle\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249760 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-sg-core-conf-yaml\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249822 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-log-httpd\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249880 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-scripts\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249907 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-config-data\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.249942 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsjht\" (UniqueName: \"kubernetes.io/projected/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-kube-api-access-lsjht\") pod \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\" (UID: \"9234f7bc-5176-4c28-9ad0-e7f3e41d4935\") " Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.250332 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.250658 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.252951 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.260117 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-kube-api-access-lsjht" (OuterVolumeSpecName: "kube-api-access-lsjht") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "kube-api-access-lsjht". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.267846 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-scripts" (OuterVolumeSpecName: "scripts") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.308276 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.353211 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.353247 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.353263 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.353277 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsjht\" (UniqueName: \"kubernetes.io/projected/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-kube-api-access-lsjht\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.375552 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.393937 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-config-data" (OuterVolumeSpecName: "config-data") pod "9234f7bc-5176-4c28-9ad0-e7f3e41d4935" (UID: "9234f7bc-5176-4c28-9ad0-e7f3e41d4935"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.394314 4992 generic.go:334] "Generic (PLEG): container finished" podID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerID="a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929" exitCode=0 Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.394396 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.394448 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerDied","Data":"a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.394483 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9234f7bc-5176-4c28-9ad0-e7f3e41d4935","Type":"ContainerDied","Data":"b8d6d162f776eb5e13eb352dcace54917976b629ce59636ecfda2db36d3d2fd4"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.394507 4992 scope.go:117] "RemoveContainer" containerID="470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.401536 4992 generic.go:334] "Generic (PLEG): container finished" podID="9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" containerID="9cfc3078597d8fd6b6bf9b55f515da2027be558e8fcdc92345a0cb093ef49c5c" exitCode=0 Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.401685 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2ngx9" event={"ID":"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0","Type":"ContainerDied","Data":"9cfc3078597d8fd6b6bf9b55f515da2027be558e8fcdc92345a0cb093ef49c5c"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.401743 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2ngx9" 
event={"ID":"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0","Type":"ContainerStarted","Data":"78bfb369aeca821d5894aa6a143e7d56dd03705b1338292a92b9ec23d295ab71"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.408019 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zxdvt" event={"ID":"95c2c6d9-ec19-4eae-aece-08798fa4fc95","Type":"ContainerStarted","Data":"e51a95646f638bd153f86790269eac27001728c4df8f42e79a04f66dea9f2a0e"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.408071 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zxdvt" event={"ID":"95c2c6d9-ec19-4eae-aece-08798fa4fc95","Type":"ContainerStarted","Data":"d4db781b5a33b98671b53007e4700ee9f66fab97fdb09783514acc3d0c52b635"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.413203 4992 generic.go:334] "Generic (PLEG): container finished" podID="2658f892-db13-4d57-96eb-dcf80264e0f7" containerID="a2facb1456cb995404008ac2aa01f932fce3c63d74bff74cfe6cdd1fd9884e5a" exitCode=0 Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.413255 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" event={"ID":"2658f892-db13-4d57-96eb-dcf80264e0f7","Type":"ContainerDied","Data":"a2facb1456cb995404008ac2aa01f932fce3c63d74bff74cfe6cdd1fd9884e5a"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.413320 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" event={"ID":"2658f892-db13-4d57-96eb-dcf80264e0f7","Type":"ContainerStarted","Data":"5b1030bcc2955330d46084af24166f142a7ecce595cbce36347545192965d01c"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.418704 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-98d958769-wtkh9" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.422392 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="a6187e6e-51fe-4cb2-a042-afce69a45d6b" containerID="1001ba270c8abf485e4825fe15c22c6d60928c6894927fafaeb7880b867c4f06" exitCode=0 Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.422656 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vlvb9" event={"ID":"a6187e6e-51fe-4cb2-a042-afce69a45d6b","Type":"ContainerDied","Data":"1001ba270c8abf485e4825fe15c22c6d60928c6894927fafaeb7880b867c4f06"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.422686 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vlvb9" event={"ID":"a6187e6e-51fe-4cb2-a042-afce69a45d6b","Type":"ContainerStarted","Data":"952ad82f24d3ced87a48c8d04911f41addd4e374a2b586c60a1f9426474918ef"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.425612 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" event={"ID":"4d4b5e45-ca2f-4db6-add4-0b395981b5cd","Type":"ContainerStarted","Data":"824907cd7b737411a0241f2f59f7d02d5aa3728882f84a81163d3e098f2793ec"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.425652 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" event={"ID":"4d4b5e45-ca2f-4db6-add4-0b395981b5cd","Type":"ContainerStarted","Data":"22111623e3b451fdb57216167d7f528fd384c98780a2f27fec9be534fc6f556b"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.430003 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" event={"ID":"e24bf444-c444-4559-af1a-5bd38f2ce48d","Type":"ContainerStarted","Data":"8875fbb11a5cd52687d9aba7763c2524a322fbd6ebc2843102a147588ae3bb6d"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.430029 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" 
event={"ID":"e24bf444-c444-4559-af1a-5bd38f2ce48d","Type":"ContainerStarted","Data":"6125ab098245c21480a5f8481cf41f0758fa32bb031abf84b7a88c4fc0c30314"} Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.455652 4992 scope.go:117] "RemoveContainer" containerID="daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.458786 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.458830 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9234f7bc-5176-4c28-9ad0-e7f3e41d4935-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.475460 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-zxdvt" podStartSLOduration=2.475441605 podStartE2EDuration="2.475441605s" podCreationTimestamp="2026-01-31 09:45:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:55.466708239 +0000 UTC m=+1251.438100226" watchObservedRunningTime="2026-01-31 09:45:55.475441605 +0000 UTC m=+1251.446833592" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.484193 4992 scope.go:117] "RemoveContainer" containerID="a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.513599 4992 scope.go:117] "RemoveContainer" containerID="c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.517844 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" 
podStartSLOduration=2.517819999 podStartE2EDuration="2.517819999s" podCreationTimestamp="2026-01-31 09:45:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:55.513854168 +0000 UTC m=+1251.485246175" watchObservedRunningTime="2026-01-31 09:45:55.517819999 +0000 UTC m=+1251.489211986" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.572656 4992 scope.go:117] "RemoveContainer" containerID="470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.580841 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630\": container with ID starting with 470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630 not found: ID does not exist" containerID="470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.580888 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630"} err="failed to get container status \"470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630\": rpc error: code = NotFound desc = could not find container \"470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630\": container with ID starting with 470ba378c46f75f9f0bf00451ea853ee74ca3376c2d3ce8f1e25fefa85591630 not found: ID does not exist" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.580914 4992 scope.go:117] "RemoveContainer" containerID="daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.586654 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4\": container with ID starting with daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4 not found: ID does not exist" containerID="daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.586699 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4"} err="failed to get container status \"daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4\": rpc error: code = NotFound desc = could not find container \"daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4\": container with ID starting with daf45301efae7bb2636169a5409b6d65405808aa0b990465021317fe834cbed4 not found: ID does not exist" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.586728 4992 scope.go:117] "RemoveContainer" containerID="a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.587813 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929\": container with ID starting with a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929 not found: ID does not exist" containerID="a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.587837 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929"} err="failed to get container status \"a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929\": rpc error: code = NotFound desc = could not find container \"a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929\": container with ID 
starting with a556856d552f716e821fdb5d0dfd3783593b935d8ae65c8e42130b15389ba929 not found: ID does not exist" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.587849 4992 scope.go:117] "RemoveContainer" containerID="c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.591519 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f\": container with ID starting with c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f not found: ID does not exist" containerID="c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.591545 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f"} err="failed to get container status \"c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f\": rpc error: code = NotFound desc = could not find container \"c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f\": container with ID starting with c5713225fc4742aa7bf5f7dc9fd4d743019b6d1ff29be04392a166abfbe31f6f not found: ID does not exist" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.607554 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fd97c9468-24mb7"] Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.607829 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fd97c9468-24mb7" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-api" containerID="cri-o://05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912" gracePeriod=30 Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.608215 4992 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-5fd97c9468-24mb7" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-httpd" containerID="cri-o://c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3" gracePeriod=30 Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.673781 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" podStartSLOduration=1.6737577030000002 podStartE2EDuration="1.673757703s" podCreationTimestamp="2026-01-31 09:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:45:55.553405932 +0000 UTC m=+1251.524797919" watchObservedRunningTime="2026-01-31 09:45:55.673757703 +0000 UTC m=+1251.645149690" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.708710 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.716454 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732015 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.732524 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="proxy-httpd" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732547 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="proxy-httpd" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.732566 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-central-agent" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732574 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-central-agent" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.732590 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="sg-core" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732598 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="sg-core" Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.732618 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-notification-agent" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732626 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-notification-agent" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732809 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="sg-core" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732822 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-central-agent" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732835 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="proxy-httpd" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.732850 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" containerName="ceilometer-notification-agent" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.734652 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.738085 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.738279 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.744381 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.861575 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:55 crc kubenswrapper[4992]: E0131 09:45:55.862199 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-pgrtl log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="07d9c68c-bc9b-4385-a80b-6cb319db9185" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.885764 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-scripts\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.885839 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-run-httpd\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.885859 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-log-httpd\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.885886 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrtl\" (UniqueName: \"kubernetes.io/projected/07d9c68c-bc9b-4385-a80b-6cb319db9185-kube-api-access-pgrtl\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.885916 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-config-data\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.886583 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.886624 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.987667 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.987814 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-scripts\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.987898 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-run-httpd\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.987931 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-log-httpd\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.987975 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgrtl\" (UniqueName: \"kubernetes.io/projected/07d9c68c-bc9b-4385-a80b-6cb319db9185-kube-api-access-pgrtl\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.988022 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-config-data\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.988058 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.988462 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-log-httpd\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.988606 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-run-httpd\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.996232 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-config-data\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:55 crc kubenswrapper[4992]: I0131 09:45:55.996399 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.004819 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.007892 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-scripts\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.008471 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgrtl\" (UniqueName: \"kubernetes.io/projected/07d9c68c-bc9b-4385-a80b-6cb319db9185-kube-api-access-pgrtl\") pod \"ceilometer-0\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " pod="openstack/ceilometer-0" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.444548 4992 generic.go:334] "Generic (PLEG): container finished" podID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerID="c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3" exitCode=0 Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.444872 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd97c9468-24mb7" event={"ID":"c7c5aa9b-d41b-4820-9156-a42c3e79bb38","Type":"ContainerDied","Data":"c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3"} Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.446971 4992 generic.go:334] "Generic (PLEG): container finished" podID="4d4b5e45-ca2f-4db6-add4-0b395981b5cd" containerID="824907cd7b737411a0241f2f59f7d02d5aa3728882f84a81163d3e098f2793ec" exitCode=0 Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.447026 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" event={"ID":"4d4b5e45-ca2f-4db6-add4-0b395981b5cd","Type":"ContainerDied","Data":"824907cd7b737411a0241f2f59f7d02d5aa3728882f84a81163d3e098f2793ec"} Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.450221 4992 generic.go:334] "Generic (PLEG): container finished" podID="e24bf444-c444-4559-af1a-5bd38f2ce48d" 
containerID="8875fbb11a5cd52687d9aba7763c2524a322fbd6ebc2843102a147588ae3bb6d" exitCode=0 Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.450277 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" event={"ID":"e24bf444-c444-4559-af1a-5bd38f2ce48d","Type":"ContainerDied","Data":"8875fbb11a5cd52687d9aba7763c2524a322fbd6ebc2843102a147588ae3bb6d"} Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.468851 4992 generic.go:334] "Generic (PLEG): container finished" podID="95c2c6d9-ec19-4eae-aece-08798fa4fc95" containerID="e51a95646f638bd153f86790269eac27001728c4df8f42e79a04f66dea9f2a0e" exitCode=0 Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.469094 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zxdvt" event={"ID":"95c2c6d9-ec19-4eae-aece-08798fa4fc95","Type":"ContainerDied","Data":"e51a95646f638bd153f86790269eac27001728c4df8f42e79a04f66dea9f2a0e"} Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.469229 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.485436 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596544 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-sg-core-conf-yaml\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596586 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-run-httpd\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596609 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-config-data\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596623 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-log-httpd\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596646 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-combined-ca-bundle\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596783 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgrtl\" (UniqueName: 
\"kubernetes.io/projected/07d9c68c-bc9b-4385-a80b-6cb319db9185-kube-api-access-pgrtl\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.596800 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-scripts\") pod \"07d9c68c-bc9b-4385-a80b-6cb319db9185\" (UID: \"07d9c68c-bc9b-4385-a80b-6cb319db9185\") " Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.601897 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-scripts" (OuterVolumeSpecName: "scripts") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.602100 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.602099 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-config-data" (OuterVolumeSpecName: "config-data") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.603494 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.609702 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.609726 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.616171 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d9c68c-bc9b-4385-a80b-6cb319db9185-kube-api-access-pgrtl" (OuterVolumeSpecName: "kube-api-access-pgrtl") pod "07d9c68c-bc9b-4385-a80b-6cb319db9185" (UID: "07d9c68c-bc9b-4385-a80b-6cb319db9185"). InnerVolumeSpecName "kube-api-access-pgrtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.714730 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.715102 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.715114 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.715125 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07d9c68c-bc9b-4385-a80b-6cb319db9185-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.715138 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.715151 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgrtl\" (UniqueName: \"kubernetes.io/projected/07d9c68c-bc9b-4385-a80b-6cb319db9185-kube-api-access-pgrtl\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:56 crc kubenswrapper[4992]: I0131 09:45:56.715164 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07d9c68c-bc9b-4385-a80b-6cb319db9185-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.013695 4992 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.021908 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.027564 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.098979 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.099211 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1f9647cf-fe89-42c1-bce1-2075d04e658e" containerName="kube-state-metrics" containerID="cri-o://98d32dbd53451f5d135262e1c0b2bd6cc0e8e0df1b42419bb8a5799e1aec476e" gracePeriod=30 Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.121755 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6187e6e-51fe-4cb2-a042-afce69a45d6b-operator-scripts\") pod \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.121812 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9gb\" (UniqueName: \"kubernetes.io/projected/2658f892-db13-4d57-96eb-dcf80264e0f7-kube-api-access-qq9gb\") pod \"2658f892-db13-4d57-96eb-dcf80264e0f7\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.121842 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhwr8\" (UniqueName: \"kubernetes.io/projected/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-kube-api-access-xhwr8\") 
pod \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.121899 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2658f892-db13-4d57-96eb-dcf80264e0f7-operator-scripts\") pod \"2658f892-db13-4d57-96eb-dcf80264e0f7\" (UID: \"2658f892-db13-4d57-96eb-dcf80264e0f7\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.121933 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-operator-scripts\") pod \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\" (UID: \"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.122051 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwp6f\" (UniqueName: \"kubernetes.io/projected/a6187e6e-51fe-4cb2-a042-afce69a45d6b-kube-api-access-qwp6f\") pod \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\" (UID: \"a6187e6e-51fe-4cb2-a042-afce69a45d6b\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.122459 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6187e6e-51fe-4cb2-a042-afce69a45d6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6187e6e-51fe-4cb2-a042-afce69a45d6b" (UID: "a6187e6e-51fe-4cb2-a042-afce69a45d6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.122779 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2658f892-db13-4d57-96eb-dcf80264e0f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2658f892-db13-4d57-96eb-dcf80264e0f7" (UID: "2658f892-db13-4d57-96eb-dcf80264e0f7"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.122991 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" (UID: "9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.125925 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-kube-api-access-xhwr8" (OuterVolumeSpecName: "kube-api-access-xhwr8") pod "9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" (UID: "9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0"). InnerVolumeSpecName "kube-api-access-xhwr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.126490 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6187e6e-51fe-4cb2-a042-afce69a45d6b-kube-api-access-qwp6f" (OuterVolumeSpecName: "kube-api-access-qwp6f") pod "a6187e6e-51fe-4cb2-a042-afce69a45d6b" (UID: "a6187e6e-51fe-4cb2-a042-afce69a45d6b"). InnerVolumeSpecName "kube-api-access-qwp6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.126813 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2658f892-db13-4d57-96eb-dcf80264e0f7-kube-api-access-qq9gb" (OuterVolumeSpecName: "kube-api-access-qq9gb") pod "2658f892-db13-4d57-96eb-dcf80264e0f7" (UID: "2658f892-db13-4d57-96eb-dcf80264e0f7"). InnerVolumeSpecName "kube-api-access-qq9gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.192952 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9234f7bc-5176-4c28-9ad0-e7f3e41d4935" path="/var/lib/kubelet/pods/9234f7bc-5176-4c28-9ad0-e7f3e41d4935/volumes" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.224523 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwp6f\" (UniqueName: \"kubernetes.io/projected/a6187e6e-51fe-4cb2-a042-afce69a45d6b-kube-api-access-qwp6f\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.224557 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6187e6e-51fe-4cb2-a042-afce69a45d6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.224567 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9gb\" (UniqueName: \"kubernetes.io/projected/2658f892-db13-4d57-96eb-dcf80264e0f7-kube-api-access-qq9gb\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.224576 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhwr8\" (UniqueName: \"kubernetes.io/projected/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-kube-api-access-xhwr8\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.224584 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2658f892-db13-4d57-96eb-dcf80264e0f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.224593 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc 
kubenswrapper[4992]: I0131 09:45:57.478464 4992 generic.go:334] "Generic (PLEG): container finished" podID="1f9647cf-fe89-42c1-bce1-2075d04e658e" containerID="98d32dbd53451f5d135262e1c0b2bd6cc0e8e0df1b42419bb8a5799e1aec476e" exitCode=2 Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.478537 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1f9647cf-fe89-42c1-bce1-2075d04e658e","Type":"ContainerDied","Data":"98d32dbd53451f5d135262e1c0b2bd6cc0e8e0df1b42419bb8a5799e1aec476e"} Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.480482 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2ngx9" event={"ID":"9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0","Type":"ContainerDied","Data":"78bfb369aeca821d5894aa6a143e7d56dd03705b1338292a92b9ec23d295ab71"} Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.480512 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78bfb369aeca821d5894aa6a143e7d56dd03705b1338292a92b9ec23d295ab71" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.480570 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2ngx9" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.483513 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.483623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ffc-account-create-update-jxhx5" event={"ID":"2658f892-db13-4d57-96eb-dcf80264e0f7","Type":"ContainerDied","Data":"5b1030bcc2955330d46084af24166f142a7ecce595cbce36347545192965d01c"} Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.483645 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b1030bcc2955330d46084af24166f142a7ecce595cbce36347545192965d01c" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.485133 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.485189 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vlvb9" event={"ID":"a6187e6e-51fe-4cb2-a042-afce69a45d6b","Type":"ContainerDied","Data":"952ad82f24d3ced87a48c8d04911f41addd4e374a2b586c60a1f9426474918ef"} Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.485213 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952ad82f24d3ced87a48c8d04911f41addd4e374a2b586c60a1f9426474918ef" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.485669 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vlvb9" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.563059 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.570558 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.586844 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.627589 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:57 crc kubenswrapper[4992]: E0131 09:45:57.628915 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" containerName="mariadb-database-create" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.628945 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" containerName="mariadb-database-create" Jan 31 09:45:57 crc kubenswrapper[4992]: E0131 09:45:57.628996 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f9647cf-fe89-42c1-bce1-2075d04e658e" containerName="kube-state-metrics" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.629006 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f9647cf-fe89-42c1-bce1-2075d04e658e" containerName="kube-state-metrics" Jan 31 09:45:57 crc kubenswrapper[4992]: E0131 09:45:57.629039 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2658f892-db13-4d57-96eb-dcf80264e0f7" containerName="mariadb-account-create-update" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.629048 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2658f892-db13-4d57-96eb-dcf80264e0f7" containerName="mariadb-account-create-update" Jan 31 09:45:57 crc kubenswrapper[4992]: E0131 09:45:57.629079 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6187e6e-51fe-4cb2-a042-afce69a45d6b" containerName="mariadb-database-create" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.629087 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6187e6e-51fe-4cb2-a042-afce69a45d6b" 
containerName="mariadb-database-create" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.631928 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" containerName="mariadb-database-create" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.631974 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6187e6e-51fe-4cb2-a042-afce69a45d6b" containerName="mariadb-database-create" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.632001 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f9647cf-fe89-42c1-bce1-2075d04e658e" containerName="kube-state-metrics" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.632024 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="2658f892-db13-4d57-96eb-dcf80264e0f7" containerName="mariadb-account-create-update" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.638001 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkcx\" (UniqueName: \"kubernetes.io/projected/1f9647cf-fe89-42c1-bce1-2075d04e658e-kube-api-access-sbkcx\") pod \"1f9647cf-fe89-42c1-bce1-2075d04e658e\" (UID: \"1f9647cf-fe89-42c1-bce1-2075d04e658e\") " Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.642999 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.643742 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f9647cf-fe89-42c1-bce1-2075d04e658e-kube-api-access-sbkcx" (OuterVolumeSpecName: "kube-api-access-sbkcx") pod "1f9647cf-fe89-42c1-bce1-2075d04e658e" (UID: "1f9647cf-fe89-42c1-bce1-2075d04e658e"). InnerVolumeSpecName "kube-api-access-sbkcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.647827 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.665601 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.666222 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.741742 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-scripts\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.741789 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-log-httpd\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.741826 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.741851 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-run-httpd\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " 
pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.741879 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-config-data\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.741967 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxwkx\" (UniqueName: \"kubernetes.io/projected/8baf1ac1-7911-4667-a71d-1d4771c0d408-kube-api-access-dxwkx\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.742304 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.742444 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkcx\" (UniqueName: \"kubernetes.io/projected/1f9647cf-fe89-42c1-bce1-2075d04e658e-kube-api-access-sbkcx\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846479 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-config-data\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846535 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxwkx\" (UniqueName: 
\"kubernetes.io/projected/8baf1ac1-7911-4667-a71d-1d4771c0d408-kube-api-access-dxwkx\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846627 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846661 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-scripts\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846679 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-log-httpd\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846707 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.846760 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-run-httpd\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 
09:45:57.847169 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-run-httpd\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.848111 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-log-httpd\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.851951 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-scripts\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.854610 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.855829 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-config-data\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.860009 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " 
pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.866183 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxwkx\" (UniqueName: \"kubernetes.io/projected/8baf1ac1-7911-4667-a71d-1d4771c0d408-kube-api-access-dxwkx\") pod \"ceilometer-0\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " pod="openstack/ceilometer-0" Jan 31 09:45:57 crc kubenswrapper[4992]: I0131 09:45:57.979216 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.112661 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.135041 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.136325 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.265279 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2bf4\" (UniqueName: \"kubernetes.io/projected/e24bf444-c444-4559-af1a-5bd38f2ce48d-kube-api-access-s2bf4\") pod \"e24bf444-c444-4559-af1a-5bd38f2ce48d\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.265352 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24bf444-c444-4559-af1a-5bd38f2ce48d-operator-scripts\") pod \"e24bf444-c444-4559-af1a-5bd38f2ce48d\" (UID: \"e24bf444-c444-4559-af1a-5bd38f2ce48d\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.265403 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmrdt\" (UniqueName: \"kubernetes.io/projected/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-kube-api-access-rmrdt\") pod \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.265454 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-operator-scripts\") pod \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\" (UID: \"4d4b5e45-ca2f-4db6-add4-0b395981b5cd\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.265576 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c6d9-ec19-4eae-aece-08798fa4fc95-operator-scripts\") pod \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.265656 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cb2z6\" (UniqueName: \"kubernetes.io/projected/95c2c6d9-ec19-4eae-aece-08798fa4fc95-kube-api-access-cb2z6\") pod \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\" (UID: \"95c2c6d9-ec19-4eae-aece-08798fa4fc95\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.266288 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bf444-c444-4559-af1a-5bd38f2ce48d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e24bf444-c444-4559-af1a-5bd38f2ce48d" (UID: "e24bf444-c444-4559-af1a-5bd38f2ce48d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.267080 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d4b5e45-ca2f-4db6-add4-0b395981b5cd" (UID: "4d4b5e45-ca2f-4db6-add4-0b395981b5cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.267612 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c2c6d9-ec19-4eae-aece-08798fa4fc95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95c2c6d9-ec19-4eae-aece-08798fa4fc95" (UID: "95c2c6d9-ec19-4eae-aece-08798fa4fc95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.277561 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-kube-api-access-rmrdt" (OuterVolumeSpecName: "kube-api-access-rmrdt") pod "4d4b5e45-ca2f-4db6-add4-0b395981b5cd" (UID: "4d4b5e45-ca2f-4db6-add4-0b395981b5cd"). 
InnerVolumeSpecName "kube-api-access-rmrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.277631 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24bf444-c444-4559-af1a-5bd38f2ce48d-kube-api-access-s2bf4" (OuterVolumeSpecName: "kube-api-access-s2bf4") pod "e24bf444-c444-4559-af1a-5bd38f2ce48d" (UID: "e24bf444-c444-4559-af1a-5bd38f2ce48d"). InnerVolumeSpecName "kube-api-access-s2bf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.277676 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c2c6d9-ec19-4eae-aece-08798fa4fc95-kube-api-access-cb2z6" (OuterVolumeSpecName: "kube-api-access-cb2z6") pod "95c2c6d9-ec19-4eae-aece-08798fa4fc95" (UID: "95c2c6d9-ec19-4eae-aece-08798fa4fc95"). InnerVolumeSpecName "kube-api-access-cb2z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.310839 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.362461 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.367454 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2bf4\" (UniqueName: \"kubernetes.io/projected/e24bf444-c444-4559-af1a-5bd38f2ce48d-kube-api-access-s2bf4\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.367480 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24bf444-c444-4559-af1a-5bd38f2ce48d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.367490 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmrdt\" (UniqueName: \"kubernetes.io/projected/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-kube-api-access-rmrdt\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.367499 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d4b5e45-ca2f-4db6-add4-0b395981b5cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.367510 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c6d9-ec19-4eae-aece-08798fa4fc95-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.367518 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb2z6\" (UniqueName: \"kubernetes.io/projected/95c2c6d9-ec19-4eae-aece-08798fa4fc95-kube-api-access-cb2z6\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.468728 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-combined-ca-bundle\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.468800 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-tls-certs\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.468879 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmlxg\" (UniqueName: \"kubernetes.io/projected/04ff2a8b-a743-475e-9ae5-5fb98839ba57-kube-api-access-gmlxg\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.468908 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-config-data\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.469013 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ff2a8b-a743-475e-9ae5-5fb98839ba57-logs\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.469045 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-scripts\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: 
\"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.469083 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-secret-key\") pod \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\" (UID: \"04ff2a8b-a743-475e-9ae5-5fb98839ba57\") " Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.470341 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04ff2a8b-a743-475e-9ae5-5fb98839ba57-logs" (OuterVolumeSpecName: "logs") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.478652 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ff2a8b-a743-475e-9ae5-5fb98839ba57-kube-api-access-gmlxg" (OuterVolumeSpecName: "kube-api-access-gmlxg") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "kube-api-access-gmlxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.479246 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.494232 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-scripts" (OuterVolumeSpecName: "scripts") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.495894 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" event={"ID":"4d4b5e45-ca2f-4db6-add4-0b395981b5cd","Type":"ContainerDied","Data":"22111623e3b451fdb57216167d7f528fd384c98780a2f27fec9be534fc6f556b"} Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.495929 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22111623e3b451fdb57216167d7f528fd384c98780a2f27fec9be534fc6f556b" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.495976 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-200a-account-create-update-5x5ln" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.497663 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-config-data" (OuterVolumeSpecName: "config-data") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.497999 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" event={"ID":"e24bf444-c444-4559-af1a-5bd38f2ce48d","Type":"ContainerDied","Data":"6125ab098245c21480a5f8481cf41f0758fa32bb031abf84b7a88c4fc0c30314"} Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.498029 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6125ab098245c21480a5f8481cf41f0758fa32bb031abf84b7a88c4fc0c30314" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.498046 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bd9b-account-create-update-6msfr" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.502350 4992 generic.go:334] "Generic (PLEG): container finished" podID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerID="235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc" exitCode=137 Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.502429 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65f6bf6f54-x2b8z" event={"ID":"04ff2a8b-a743-475e-9ae5-5fb98839ba57","Type":"ContainerDied","Data":"235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc"} Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.502452 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65f6bf6f54-x2b8z" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.502466 4992 scope.go:117] "RemoveContainer" containerID="ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.502456 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65f6bf6f54-x2b8z" event={"ID":"04ff2a8b-a743-475e-9ae5-5fb98839ba57","Type":"ContainerDied","Data":"7d2291052b458b7f0597ada1d75c141764c44393a2ea1260a5b192420e953f59"} Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.503847 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.504933 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1f9647cf-fe89-42c1-bce1-2075d04e658e","Type":"ContainerDied","Data":"ed72c179099a2218c009f4b2ec941672ba9eb7e2773005062a0b58c391757397"} Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.504998 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.507757 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zxdvt" event={"ID":"95c2c6d9-ec19-4eae-aece-08798fa4fc95","Type":"ContainerDied","Data":"d4db781b5a33b98671b53007e4700ee9f66fab97fdb09783514acc3d0c52b635"} Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.508040 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4db781b5a33b98671b53007e4700ee9f66fab97fdb09783514acc3d0c52b635" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.507869 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zxdvt" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.534316 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "04ff2a8b-a743-475e-9ae5-5fb98839ba57" (UID: "04ff2a8b-a743-475e-9ae5-5fb98839ba57"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.570914 4992 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.571042 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.571143 4992 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ff2a8b-a743-475e-9ae5-5fb98839ba57-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.571230 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmlxg\" (UniqueName: \"kubernetes.io/projected/04ff2a8b-a743-475e-9ae5-5fb98839ba57-kube-api-access-gmlxg\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.571297 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.571360 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04ff2a8b-a743-475e-9ae5-5fb98839ba57-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.571432 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04ff2a8b-a743-475e-9ae5-5fb98839ba57-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.584078 4992 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.593669 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.610327 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.610896 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4b5e45-ca2f-4db6-add4-0b395981b5cd" containerName="mariadb-account-create-update" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.610990 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4b5e45-ca2f-4db6-add4-0b395981b5cd" containerName="mariadb-account-create-update" Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.611062 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon-log" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.611112 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon-log" Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.611199 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.611271 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.611335 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24bf444-c444-4559-af1a-5bd38f2ce48d" containerName="mariadb-account-create-update" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.611402 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24bf444-c444-4559-af1a-5bd38f2ce48d" 
containerName="mariadb-account-create-update" Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.611491 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c2c6d9-ec19-4eae-aece-08798fa4fc95" containerName="mariadb-database-create" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.611540 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c2c6d9-ec19-4eae-aece-08798fa4fc95" containerName="mariadb-database-create" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.611741 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c2c6d9-ec19-4eae-aece-08798fa4fc95" containerName="mariadb-database-create" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.611926 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.612016 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4b5e45-ca2f-4db6-add4-0b395981b5cd" containerName="mariadb-account-create-update" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.612084 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" containerName="horizon-log" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.612168 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24bf444-c444-4559-af1a-5bd38f2ce48d" containerName="mariadb-account-create-update" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.612802 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.612981 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.644059 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.644238 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.708668 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.714622 4992 scope.go:117] "RemoveContainer" containerID="235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.762479 4992 scope.go:117] "RemoveContainer" containerID="ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d" Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.763241 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d\": container with ID starting with ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d not found: ID does not exist" containerID="ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.763283 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d"} err="failed to get container status \"ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d\": rpc error: code = NotFound desc = could not find container \"ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d\": container with ID starting with ca25af36153839640483f7960a573ac8a9e0baf0669dab993d3eee4217e0f72d not found: ID does not exist" Jan 31 09:45:58 crc 
kubenswrapper[4992]: I0131 09:45:58.763314 4992 scope.go:117] "RemoveContainer" containerID="235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc" Jan 31 09:45:58 crc kubenswrapper[4992]: E0131 09:45:58.763611 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc\": container with ID starting with 235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc not found: ID does not exist" containerID="235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.763634 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc"} err="failed to get container status \"235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc\": rpc error: code = NotFound desc = could not find container \"235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc\": container with ID starting with 235244b99bd6545b56a68bbe33b171a8a66da4e113507373bfd664c22bb694dc not found: ID does not exist" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.763653 4992 scope.go:117] "RemoveContainer" containerID="98d32dbd53451f5d135262e1c0b2bd6cc0e8e0df1b42419bb8a5799e1aec476e" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.775039 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.775084 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29fvd\" (UniqueName: 
\"kubernetes.io/projected/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-api-access-29fvd\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.775120 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.775158 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.835540 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65f6bf6f54-x2b8z"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.845245 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-65f6bf6f54-x2b8z"] Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.876750 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.876802 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29fvd\" (UniqueName: 
\"kubernetes.io/projected/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-api-access-29fvd\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.876829 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.876874 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.882186 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.882242 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.884360 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.895102 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29fvd\" (UniqueName: \"kubernetes.io/projected/29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0-kube-api-access-29fvd\") pod \"kube-state-metrics-0\" (UID: \"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0\") " pod="openstack/kube-state-metrics-0" Jan 31 09:45:58 crc kubenswrapper[4992]: I0131 09:45:58.995557 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.128916 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.199813 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ff2a8b-a743-475e-9ae5-5fb98839ba57" path="/var/lib/kubelet/pods/04ff2a8b-a743-475e-9ae5-5fb98839ba57/volumes" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.200517 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d9c68c-bc9b-4385-a80b-6cb319db9185" path="/var/lib/kubelet/pods/07d9c68c-bc9b-4385-a80b-6cb319db9185/volumes" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.200831 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f9647cf-fe89-42c1-bce1-2075d04e658e" path="/var/lib/kubelet/pods/1f9647cf-fe89-42c1-bce1-2075d04e658e/volumes" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.281748 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cwgg\" (UniqueName: \"kubernetes.io/projected/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-kube-api-access-9cwgg\") pod 
\"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.281806 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-httpd-config\") pod \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.281887 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-combined-ca-bundle\") pod \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.281917 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-config\") pod \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.281952 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-ovndb-tls-certs\") pod \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\" (UID: \"c7c5aa9b-d41b-4820-9156-a42c3e79bb38\") " Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.286924 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c7c5aa9b-d41b-4820-9156-a42c3e79bb38" (UID: "c7c5aa9b-d41b-4820-9156-a42c3e79bb38"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.286966 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-kube-api-access-9cwgg" (OuterVolumeSpecName: "kube-api-access-9cwgg") pod "c7c5aa9b-d41b-4820-9156-a42c3e79bb38" (UID: "c7c5aa9b-d41b-4820-9156-a42c3e79bb38"). InnerVolumeSpecName "kube-api-access-9cwgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.329254 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-config" (OuterVolumeSpecName: "config") pod "c7c5aa9b-d41b-4820-9156-a42c3e79bb38" (UID: "c7c5aa9b-d41b-4820-9156-a42c3e79bb38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.331944 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7c5aa9b-d41b-4820-9156-a42c3e79bb38" (UID: "c7c5aa9b-d41b-4820-9156-a42c3e79bb38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.361126 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c7c5aa9b-d41b-4820-9156-a42c3e79bb38" (UID: "c7c5aa9b-d41b-4820-9156-a42c3e79bb38"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.387758 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.387794 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.387805 4992 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.387813 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cwgg\" (UniqueName: \"kubernetes.io/projected/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-kube-api-access-9cwgg\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.387824 4992 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c7c5aa9b-d41b-4820-9156-a42c3e79bb38-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.456310 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.532644 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0","Type":"ContainerStarted","Data":"24d72faaf574ec34268643d8ab05586137c876fba7210ca64dd6d34a954b1a43"} Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.534164 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerStarted","Data":"78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849"} Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.534203 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerStarted","Data":"6acccfea62d3cc0280abf2212db33b52271a286f68d34e8f359c389aa2b14f32"} Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.537111 4992 generic.go:334] "Generic (PLEG): container finished" podID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerID="05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912" exitCode=0 Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.537221 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd97c9468-24mb7" event={"ID":"c7c5aa9b-d41b-4820-9156-a42c3e79bb38","Type":"ContainerDied","Data":"05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912"} Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.537279 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fd97c9468-24mb7" event={"ID":"c7c5aa9b-d41b-4820-9156-a42c3e79bb38","Type":"ContainerDied","Data":"fe08127f28f1b17d5d01ca4fce1334a16ebb1af2867c2ccee4bb5fa887a3602f"} Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.537303 4992 scope.go:117] "RemoveContainer" containerID="c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.537535 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fd97c9468-24mb7" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.596114 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fd97c9468-24mb7"] Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.596748 4992 scope.go:117] "RemoveContainer" containerID="05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.606025 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5fd97c9468-24mb7"] Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.637616 4992 scope.go:117] "RemoveContainer" containerID="c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3" Jan 31 09:45:59 crc kubenswrapper[4992]: E0131 09:45:59.638099 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3\": container with ID starting with c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3 not found: ID does not exist" containerID="c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.638140 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3"} err="failed to get container status \"c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3\": rpc error: code = NotFound desc = could not find container \"c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3\": container with ID starting with c4aa861035566b13a22a319963273e108134e419f8b6ae9c3a02c7a8d8a2ebe3 not found: ID does not exist" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.638166 4992 scope.go:117] "RemoveContainer" containerID="05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912" Jan 31 09:45:59 
crc kubenswrapper[4992]: E0131 09:45:59.638564 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912\": container with ID starting with 05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912 not found: ID does not exist" containerID="05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912" Jan 31 09:45:59 crc kubenswrapper[4992]: I0131 09:45:59.638586 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912"} err="failed to get container status \"05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912\": rpc error: code = NotFound desc = could not find container \"05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912\": container with ID starting with 05e403d26a756dfab5748e94d7e335757ac7836aefb4ea2bfd8e22d03bf85912 not found: ID does not exist" Jan 31 09:46:00 crc kubenswrapper[4992]: I0131 09:46:00.561820 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0","Type":"ContainerStarted","Data":"d992542e80cac77081511efccca1267292bb1b5467915e780f7ac0e6f59bc0e5"} Jan 31 09:46:00 crc kubenswrapper[4992]: I0131 09:46:00.562263 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 31 09:46:00 crc kubenswrapper[4992]: I0131 09:46:00.563935 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerStarted","Data":"5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce"} Jan 31 09:46:00 crc kubenswrapper[4992]: I0131 09:46:00.588089 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=2.220622934 podStartE2EDuration="2.588067978s" podCreationTimestamp="2026-01-31 09:45:58 +0000 UTC" firstStartedPulling="2026-01-31 09:45:59.499660979 +0000 UTC m=+1255.471052966" lastFinishedPulling="2026-01-31 09:45:59.867106023 +0000 UTC m=+1255.838498010" observedRunningTime="2026-01-31 09:46:00.583759757 +0000 UTC m=+1256.555151764" watchObservedRunningTime="2026-01-31 09:46:00.588067978 +0000 UTC m=+1256.559459975" Jan 31 09:46:01 crc kubenswrapper[4992]: I0131 09:46:01.192629 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" path="/var/lib/kubelet/pods/c7c5aa9b-d41b-4820-9156-a42c3e79bb38/volumes" Jan 31 09:46:01 crc kubenswrapper[4992]: I0131 09:46:01.574375 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerStarted","Data":"f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f"} Jan 31 09:46:03 crc kubenswrapper[4992]: I0131 09:46:03.606192 4992 generic.go:334] "Generic (PLEG): container finished" podID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerID="414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05" exitCode=1 Jan 31 09:46:03 crc kubenswrapper[4992]: I0131 09:46:03.606956 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerDied","Data":"414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05"} Jan 31 09:46:03 crc kubenswrapper[4992]: I0131 09:46:03.607221 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-central-agent" containerID="cri-o://78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849" gracePeriod=30 Jan 31 09:46:03 crc kubenswrapper[4992]: I0131 09:46:03.608126 4992 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="sg-core" containerID="cri-o://f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f" gracePeriod=30 Jan 31 09:46:03 crc kubenswrapper[4992]: I0131 09:46:03.608256 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-notification-agent" containerID="cri-o://5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce" gracePeriod=30 Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.214338 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6ftmf"] Jan 31 09:46:04 crc kubenswrapper[4992]: E0131 09:46:04.214787 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-httpd" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.214805 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-httpd" Jan 31 09:46:04 crc kubenswrapper[4992]: E0131 09:46:04.214820 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-api" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.214827 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-api" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.214988 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-api" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.215006 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7c5aa9b-d41b-4820-9156-a42c3e79bb38" containerName="neutron-httpd" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 
09:46:04.215542 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.220204 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m9scq" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.220402 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.220539 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.227844 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6ftmf"] Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.385904 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-config-data\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.385962 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-scripts\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.386257 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-kube-api-access-6sgqq\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: 
\"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.386314 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.488291 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-kube-api-access-6sgqq\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.488629 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.489592 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-config-data\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.489631 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-scripts\") pod 
\"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.495073 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-scripts\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.495242 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-config-data\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.495663 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.507021 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-kube-api-access-6sgqq\") pod \"nova-cell0-conductor-db-sync-6ftmf\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.533118 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.621793 4992 generic.go:334] "Generic (PLEG): container finished" podID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerID="f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f" exitCode=2 Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.621833 4992 generic.go:334] "Generic (PLEG): container finished" podID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerID="5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce" exitCode=0 Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.621858 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerDied","Data":"f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f"} Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.621887 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerDied","Data":"5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce"} Jan 31 09:46:04 crc kubenswrapper[4992]: I0131 09:46:04.965978 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6ftmf"] Jan 31 09:46:04 crc kubenswrapper[4992]: W0131 09:46:04.970454 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cb0cadc_8ce2_4abd_8a60_461019fb6f6d.slice/crio-ac8b3ddd34b4b3e6520432cadeeb2ae38d6ac0b5f0c05c5e593da498cba0878a WatchSource:0}: Error finding container ac8b3ddd34b4b3e6520432cadeeb2ae38d6ac0b5f0c05c5e593da498cba0878a: Status 404 returned error can't find the container with id ac8b3ddd34b4b3e6520432cadeeb2ae38d6ac0b5f0c05c5e593da498cba0878a Jan 31 09:46:05 crc kubenswrapper[4992]: I0131 09:46:05.642707 4992 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" event={"ID":"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d","Type":"ContainerStarted","Data":"ac8b3ddd34b4b3e6520432cadeeb2ae38d6ac0b5f0c05c5e593da498cba0878a"} Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.411704 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523187 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-scripts\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523303 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-combined-ca-bundle\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523405 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-config-data\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523468 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-run-httpd\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523498 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-log-httpd\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523559 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-sg-core-conf-yaml\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.523606 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxwkx\" (UniqueName: \"kubernetes.io/projected/8baf1ac1-7911-4667-a71d-1d4771c0d408-kube-api-access-dxwkx\") pod \"8baf1ac1-7911-4667-a71d-1d4771c0d408\" (UID: \"8baf1ac1-7911-4667-a71d-1d4771c0d408\") " Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.524761 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.525021 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.534381 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baf1ac1-7911-4667-a71d-1d4771c0d408-kube-api-access-dxwkx" (OuterVolumeSpecName: "kube-api-access-dxwkx") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "kube-api-access-dxwkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.542586 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-scripts" (OuterVolumeSpecName: "scripts") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.554131 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.604594 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.623450 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-config-data" (OuterVolumeSpecName: "config-data") pod "8baf1ac1-7911-4667-a71d-1d4771c0d408" (UID: "8baf1ac1-7911-4667-a71d-1d4771c0d408"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626235 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626261 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626271 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8baf1ac1-7911-4667-a71d-1d4771c0d408-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626280 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626291 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxwkx\" (UniqueName: \"kubernetes.io/projected/8baf1ac1-7911-4667-a71d-1d4771c0d408-kube-api-access-dxwkx\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626298 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.626306 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8baf1ac1-7911-4667-a71d-1d4771c0d408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.663948 4992 generic.go:334] "Generic (PLEG): container finished" podID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerID="78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849" exitCode=0 Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.663986 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerDied","Data":"78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849"} Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.664025 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8baf1ac1-7911-4667-a71d-1d4771c0d408","Type":"ContainerDied","Data":"6acccfea62d3cc0280abf2212db33b52271a286f68d34e8f359c389aa2b14f32"} Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.664066 4992 scope.go:117] "RemoveContainer" containerID="414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.664078 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.698094 4992 scope.go:117] "RemoveContainer" containerID="f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.712060 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.721082 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.743770 4992 scope.go:117] "RemoveContainer" containerID="5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.751157 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.751667 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-notification-agent" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.751691 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-notification-agent" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.751717 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="proxy-httpd" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.751727 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="proxy-httpd" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.751748 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-central-agent" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.751757 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-central-agent" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.751775 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="sg-core" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.751783 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="sg-core" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.752013 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-central-agent" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.752028 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="ceilometer-notification-agent" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.752041 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="sg-core" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.752071 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" containerName="proxy-httpd" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.753938 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.764039 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.765710 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.765960 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.767165 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.782496 4992 scope.go:117] "RemoveContainer" containerID="78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.820661 4992 scope.go:117] "RemoveContainer" containerID="414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.821077 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05\": container with ID starting with 414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05 not found: ID does not exist" containerID="414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.821107 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05"} err="failed to get container status \"414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05\": rpc error: code = NotFound desc = could not find container \"414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05\": 
container with ID starting with 414193debb1e977402450dccc2bebffa41c25f0866c34a32f9d5ad1dbc2a8a05 not found: ID does not exist" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.822529 4992 scope.go:117] "RemoveContainer" containerID="f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.823081 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f\": container with ID starting with f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f not found: ID does not exist" containerID="f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.823126 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f"} err="failed to get container status \"f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f\": rpc error: code = NotFound desc = could not find container \"f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f\": container with ID starting with f42230d4f889942767cfe43a8e0becf9ab38583d0a9dacfdae306fadc15b072f not found: ID does not exist" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.823162 4992 scope.go:117] "RemoveContainer" containerID="5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.823774 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce\": container with ID starting with 5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce not found: ID does not exist" 
containerID="5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.823818 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce"} err="failed to get container status \"5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce\": rpc error: code = NotFound desc = could not find container \"5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce\": container with ID starting with 5506f8dfea72986300da3e97d94cd65048e6fd3ebcf7578cd30df3febed9e0ce not found: ID does not exist" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.823847 4992 scope.go:117] "RemoveContainer" containerID="78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849" Jan 31 09:46:06 crc kubenswrapper[4992]: E0131 09:46:06.824252 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849\": container with ID starting with 78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849 not found: ID does not exist" containerID="78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.824281 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849"} err="failed to get container status \"78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849\": rpc error: code = NotFound desc = could not find container \"78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849\": container with ID starting with 78c2af8752a4bb392fd9a625bceba239eaf6474cf431da481590e151c75e7849 not found: ID does not exist" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832363 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832528 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-scripts\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832589 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-log-httpd\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832712 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832750 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-config-data\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832775 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-run-httpd\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832804 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.832862 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-644n9\" (UniqueName: \"kubernetes.io/projected/64ede922-dfdf-4118-9f79-672c3f785304-kube-api-access-644n9\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934564 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-644n9\" (UniqueName: \"kubernetes.io/projected/64ede922-dfdf-4118-9f79-672c3f785304-kube-api-access-644n9\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934664 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934709 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-scripts\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " 
pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934734 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-log-httpd\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934821 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934846 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-config-data\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934865 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-run-httpd\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.934893 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.935471 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-log-httpd\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.935565 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-run-httpd\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.938601 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-scripts\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.939312 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.939349 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-config-data\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.942822 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.943240 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:06 crc kubenswrapper[4992]: I0131 09:46:06.955745 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-644n9\" (UniqueName: \"kubernetes.io/projected/64ede922-dfdf-4118-9f79-672c3f785304-kube-api-access-644n9\") pod \"ceilometer-0\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " pod="openstack/ceilometer-0" Jan 31 09:46:07 crc kubenswrapper[4992]: I0131 09:46:07.087113 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:07 crc kubenswrapper[4992]: I0131 09:46:07.205183 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8baf1ac1-7911-4667-a71d-1d4771c0d408" path="/var/lib/kubelet/pods/8baf1ac1-7911-4667-a71d-1d4771c0d408/volumes" Jan 31 09:46:07 crc kubenswrapper[4992]: I0131 09:46:07.608780 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:07 crc kubenswrapper[4992]: W0131 09:46:07.616138 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ede922_dfdf_4118_9f79_672c3f785304.slice/crio-703808565a15ba8831c8ca9c911183d13334e2a592166e52bb8ed64bd4c71e5a WatchSource:0}: Error finding container 703808565a15ba8831c8ca9c911183d13334e2a592166e52bb8ed64bd4c71e5a: Status 404 returned error can't find the container with id 703808565a15ba8831c8ca9c911183d13334e2a592166e52bb8ed64bd4c71e5a Jan 31 09:46:07 crc kubenswrapper[4992]: I0131 09:46:07.674210 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerStarted","Data":"703808565a15ba8831c8ca9c911183d13334e2a592166e52bb8ed64bd4c71e5a"} Jan 31 09:46:07 crc kubenswrapper[4992]: I0131 09:46:07.742499 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:09 crc kubenswrapper[4992]: I0131 09:46:09.100324 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 31 09:46:14 crc kubenswrapper[4992]: I0131 09:46:14.745038 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerStarted","Data":"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a"} Jan 31 09:46:16 crc kubenswrapper[4992]: I0131 09:46:16.764992 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" event={"ID":"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d","Type":"ContainerStarted","Data":"4d8c08f9453f44e563ea5382a90b0b7278aed11e3ba17611cf2b7fd5fd09175f"} Jan 31 09:46:16 crc kubenswrapper[4992]: I0131 09:46:16.769061 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerStarted","Data":"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23"} Jan 31 09:46:16 crc kubenswrapper[4992]: I0131 09:46:16.794561 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" podStartSLOduration=2.100309211 podStartE2EDuration="12.794534551s" podCreationTimestamp="2026-01-31 09:46:04 +0000 UTC" firstStartedPulling="2026-01-31 09:46:04.972185254 +0000 UTC m=+1260.943577241" lastFinishedPulling="2026-01-31 09:46:15.666410574 +0000 UTC m=+1271.637802581" observedRunningTime="2026-01-31 09:46:16.787827062 +0000 UTC m=+1272.759219049" watchObservedRunningTime="2026-01-31 09:46:16.794534551 
+0000 UTC m=+1272.765926538" Jan 31 09:46:18 crc kubenswrapper[4992]: I0131 09:46:18.798695 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerStarted","Data":"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a"} Jan 31 09:46:22 crc kubenswrapper[4992]: I0131 09:46:22.836228 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerStarted","Data":"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e"} Jan 31 09:46:22 crc kubenswrapper[4992]: I0131 09:46:22.848620 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-central-agent" containerID="cri-o://9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" gracePeriod=30 Jan 31 09:46:22 crc kubenswrapper[4992]: I0131 09:46:22.848692 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="sg-core" containerID="cri-o://109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" gracePeriod=30 Jan 31 09:46:22 crc kubenswrapper[4992]: I0131 09:46:22.848724 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-notification-agent" containerID="cri-o://d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" gracePeriod=30 Jan 31 09:46:22 crc kubenswrapper[4992]: I0131 09:46:22.848915 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="proxy-httpd" 
containerID="cri-o://2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" gracePeriod=30 Jan 31 09:46:22 crc kubenswrapper[4992]: I0131 09:46:22.880244 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.996533341 podStartE2EDuration="16.880225124s" podCreationTimestamp="2026-01-31 09:46:06 +0000 UTC" firstStartedPulling="2026-01-31 09:46:07.618937323 +0000 UTC m=+1263.590329310" lastFinishedPulling="2026-01-31 09:46:21.502629105 +0000 UTC m=+1277.474021093" observedRunningTime="2026-01-31 09:46:22.87478637 +0000 UTC m=+1278.846178367" watchObservedRunningTime="2026-01-31 09:46:22.880225124 +0000 UTC m=+1278.851617111" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.538927 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.707062 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-run-httpd\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.707350 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-log-httpd\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.707371 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-config-data\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.707452 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-sg-core-conf-yaml\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.707513 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-combined-ca-bundle\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.707878 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.708104 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-scripts\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.708143 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.708203 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-ceilometer-tls-certs\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.708237 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-644n9\" (UniqueName: \"kubernetes.io/projected/64ede922-dfdf-4118-9f79-672c3f785304-kube-api-access-644n9\") pod \"64ede922-dfdf-4118-9f79-672c3f785304\" (UID: \"64ede922-dfdf-4118-9f79-672c3f785304\") " Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.708702 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.708724 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64ede922-dfdf-4118-9f79-672c3f785304-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.715789 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-scripts" (OuterVolumeSpecName: "scripts") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.715802 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ede922-dfdf-4118-9f79-672c3f785304-kube-api-access-644n9" (OuterVolumeSpecName: "kube-api-access-644n9") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "kube-api-access-644n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.739126 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.762019 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.769879 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.790983 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-config-data" (OuterVolumeSpecName: "config-data") pod "64ede922-dfdf-4118-9f79-672c3f785304" (UID: "64ede922-dfdf-4118-9f79-672c3f785304"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.811383 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.811445 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.811456 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.811467 4992 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.811476 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-644n9\" (UniqueName: \"kubernetes.io/projected/64ede922-dfdf-4118-9f79-672c3f785304-kube-api-access-644n9\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.811484 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/64ede922-dfdf-4118-9f79-672c3f785304-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845498 4992 generic.go:334] "Generic (PLEG): container finished" podID="64ede922-dfdf-4118-9f79-672c3f785304" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" exitCode=0 Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845534 4992 generic.go:334] "Generic (PLEG): container finished" podID="64ede922-dfdf-4118-9f79-672c3f785304" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" exitCode=2 Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845544 4992 generic.go:334] "Generic (PLEG): container finished" podID="64ede922-dfdf-4118-9f79-672c3f785304" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" exitCode=0 Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845552 4992 generic.go:334] "Generic (PLEG): container finished" podID="64ede922-dfdf-4118-9f79-672c3f785304" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" exitCode=0 Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845587 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerDied","Data":"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e"} Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845616 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerDied","Data":"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a"} Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845631 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerDied","Data":"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23"} Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845642 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerDied","Data":"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a"} Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845653 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"64ede922-dfdf-4118-9f79-672c3f785304","Type":"ContainerDied","Data":"703808565a15ba8831c8ca9c911183d13334e2a592166e52bb8ed64bd4c71e5a"} Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845669 4992 scope.go:117] "RemoveContainer" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.845821 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.880734 4992 scope.go:117] "RemoveContainer" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.893463 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.904446 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.916485 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:23 crc kubenswrapper[4992]: E0131 09:46:23.916846 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-central-agent" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.916869 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-central-agent" Jan 31 09:46:23 crc kubenswrapper[4992]: E0131 09:46:23.916884 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="sg-core" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.916890 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="sg-core" Jan 31 09:46:23 crc kubenswrapper[4992]: E0131 09:46:23.916914 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-notification-agent" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.916929 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-notification-agent" Jan 31 09:46:23 crc kubenswrapper[4992]: E0131 09:46:23.916951 4992 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="proxy-httpd" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.916964 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="proxy-httpd" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.917126 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="sg-core" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.917137 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-central-agent" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.917148 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="ceilometer-notification-agent" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.917159 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ede922-dfdf-4118-9f79-672c3f785304" containerName="proxy-httpd" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.919308 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.921998 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.922136 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.922253 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.928341 4992 scope.go:117] "RemoveContainer" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.932163 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:23 crc kubenswrapper[4992]: I0131 09:46:23.961682 4992 scope.go:117] "RemoveContainer" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.015987 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-log-httpd\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016202 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-run-httpd\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016309 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlk4\" (UniqueName: 
\"kubernetes.io/projected/76229fae-f3ae-4090-89a2-43780cf2f2ba-kube-api-access-jdlk4\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016436 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016561 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-config-data\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016717 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-scripts\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016816 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.016886 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.031579 4992 scope.go:117] "RemoveContainer" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" Jan 31 09:46:24 crc kubenswrapper[4992]: E0131 09:46:24.032108 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": container with ID starting with 2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e not found: ID does not exist" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.032171 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e"} err="failed to get container status \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": rpc error: code = NotFound desc = could not find container \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": container with ID starting with 2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.032198 4992 scope.go:117] "RemoveContainer" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" Jan 31 09:46:24 crc kubenswrapper[4992]: E0131 09:46:24.032585 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": container with ID starting with 109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a not found: ID does not exist" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" Jan 31 09:46:24 crc kubenswrapper[4992]: 
I0131 09:46:24.032618 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a"} err="failed to get container status \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": rpc error: code = NotFound desc = could not find container \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": container with ID starting with 109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.032638 4992 scope.go:117] "RemoveContainer" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" Jan 31 09:46:24 crc kubenswrapper[4992]: E0131 09:46:24.032835 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": container with ID starting with d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23 not found: ID does not exist" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.032865 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23"} err="failed to get container status \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": rpc error: code = NotFound desc = could not find container \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": container with ID starting with d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23 not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.032881 4992 scope.go:117] "RemoveContainer" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" Jan 31 09:46:24 crc 
kubenswrapper[4992]: E0131 09:46:24.033170 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": container with ID starting with 9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a not found: ID does not exist" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.033296 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a"} err="failed to get container status \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": rpc error: code = NotFound desc = could not find container \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": container with ID starting with 9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.033394 4992 scope.go:117] "RemoveContainer" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.033771 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e"} err="failed to get container status \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": rpc error: code = NotFound desc = could not find container \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": container with ID starting with 2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.033799 4992 scope.go:117] "RemoveContainer" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" Jan 31 
09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034001 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a"} err="failed to get container status \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": rpc error: code = NotFound desc = could not find container \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": container with ID starting with 109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034074 4992 scope.go:117] "RemoveContainer" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034328 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23"} err="failed to get container status \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": rpc error: code = NotFound desc = could not find container \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": container with ID starting with d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23 not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034358 4992 scope.go:117] "RemoveContainer" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034578 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a"} err="failed to get container status \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": rpc error: code = NotFound desc = could not find container 
\"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": container with ID starting with 9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034602 4992 scope.go:117] "RemoveContainer" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034770 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e"} err="failed to get container status \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": rpc error: code = NotFound desc = could not find container \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": container with ID starting with 2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.034792 4992 scope.go:117] "RemoveContainer" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.035005 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a"} err="failed to get container status \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": rpc error: code = NotFound desc = could not find container \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": container with ID starting with 109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.035063 4992 scope.go:117] "RemoveContainer" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.035789 4992 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23"} err="failed to get container status \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": rpc error: code = NotFound desc = could not find container \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": container with ID starting with d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23 not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.035811 4992 scope.go:117] "RemoveContainer" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.036186 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a"} err="failed to get container status \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": rpc error: code = NotFound desc = could not find container \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": container with ID starting with 9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.036309 4992 scope.go:117] "RemoveContainer" containerID="2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.036625 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e"} err="failed to get container status \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": rpc error: code = NotFound desc = could not find container \"2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e\": container with ID starting with 
2273d005f2e1b2211758030620470f13081122f46bf402e5474f84887c16a49e not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.036644 4992 scope.go:117] "RemoveContainer" containerID="109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.036899 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a"} err="failed to get container status \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": rpc error: code = NotFound desc = could not find container \"109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a\": container with ID starting with 109c384469b973c5454faf5be293cc9443fe560bd42f0033cc423b41ad9aaf9a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.036946 4992 scope.go:117] "RemoveContainer" containerID="d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.037175 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23"} err="failed to get container status \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": rpc error: code = NotFound desc = could not find container \"d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23\": container with ID starting with d2ff7eb5d8c281aa3e0752a8a5ac9f465d00d4ab28e72834e5d7feb0b7511e23 not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.037300 4992 scope.go:117] "RemoveContainer" containerID="9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.037635 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a"} err="failed to get container status \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": rpc error: code = NotFound desc = could not find container \"9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a\": container with ID starting with 9a62603a3f1ea8379e254c9410d29d7a017008035b64916c79bb4e00fc48459a not found: ID does not exist" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.118894 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-log-httpd\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119135 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-run-httpd\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119234 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlk4\" (UniqueName: \"kubernetes.io/projected/76229fae-f3ae-4090-89a2-43780cf2f2ba-kube-api-access-jdlk4\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119349 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119454 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-run-httpd\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119456 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-config-data\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119652 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-log-httpd\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119668 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-scripts\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119745 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.119762 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" 
Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.124246 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-scripts\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.124274 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.124781 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.124965 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.135890 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-config-data\") pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.142209 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlk4\" (UniqueName: \"kubernetes.io/projected/76229fae-f3ae-4090-89a2-43780cf2f2ba-kube-api-access-jdlk4\") 
pod \"ceilometer-0\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.238368 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.811496 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:46:24 crc kubenswrapper[4992]: W0131 09:46:24.814553 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76229fae_f3ae_4090_89a2_43780cf2f2ba.slice/crio-6a3ac477702cee8d8d1aee11ed0559e13ad36400803d6991787d80b736cc1099 WatchSource:0}: Error finding container 6a3ac477702cee8d8d1aee11ed0559e13ad36400803d6991787d80b736cc1099: Status 404 returned error can't find the container with id 6a3ac477702cee8d8d1aee11ed0559e13ad36400803d6991787d80b736cc1099 Jan 31 09:46:24 crc kubenswrapper[4992]: I0131 09:46:24.869455 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerStarted","Data":"6a3ac477702cee8d8d1aee11ed0559e13ad36400803d6991787d80b736cc1099"} Jan 31 09:46:25 crc kubenswrapper[4992]: I0131 09:46:25.192681 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ede922-dfdf-4118-9f79-672c3f785304" path="/var/lib/kubelet/pods/64ede922-dfdf-4118-9f79-672c3f785304/volumes" Jan 31 09:46:25 crc kubenswrapper[4992]: I0131 09:46:25.881702 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerStarted","Data":"727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b"} Jan 31 09:46:26 crc kubenswrapper[4992]: I0131 09:46:26.891262 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerStarted","Data":"431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd"} Jan 31 09:46:26 crc kubenswrapper[4992]: I0131 09:46:26.891628 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerStarted","Data":"d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e"} Jan 31 09:46:29 crc kubenswrapper[4992]: I0131 09:46:29.919168 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerStarted","Data":"f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4"} Jan 31 09:46:29 crc kubenswrapper[4992]: I0131 09:46:29.920722 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:46:29 crc kubenswrapper[4992]: I0131 09:46:29.941316 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.225555679 podStartE2EDuration="6.941290499s" podCreationTimestamp="2026-01-31 09:46:23 +0000 UTC" firstStartedPulling="2026-01-31 09:46:24.818410017 +0000 UTC m=+1280.789802014" lastFinishedPulling="2026-01-31 09:46:29.534144847 +0000 UTC m=+1285.505536834" observedRunningTime="2026-01-31 09:46:29.9370652 +0000 UTC m=+1285.908457267" watchObservedRunningTime="2026-01-31 09:46:29.941290499 +0000 UTC m=+1285.912682526" Jan 31 09:46:33 crc kubenswrapper[4992]: I0131 09:46:33.959621 4992 generic.go:334] "Generic (PLEG): container finished" podID="8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" containerID="4d8c08f9453f44e563ea5382a90b0b7278aed11e3ba17611cf2b7fd5fd09175f" exitCode=0 Jan 31 09:46:33 crc kubenswrapper[4992]: I0131 09:46:33.959700 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" 
event={"ID":"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d","Type":"ContainerDied","Data":"4d8c08f9453f44e563ea5382a90b0b7278aed11e3ba17611cf2b7fd5fd09175f"} Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.283784 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.420087 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-kube-api-access-6sgqq\") pod \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.420192 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-combined-ca-bundle\") pod \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.420303 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-scripts\") pod \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.420375 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-config-data\") pod \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\" (UID: \"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d\") " Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.425635 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-scripts" (OuterVolumeSpecName: "scripts") pod 
"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" (UID: "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.425692 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-kube-api-access-6sgqq" (OuterVolumeSpecName: "kube-api-access-6sgqq") pod "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" (UID: "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d"). InnerVolumeSpecName "kube-api-access-6sgqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.448355 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-config-data" (OuterVolumeSpecName: "config-data") pod "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" (UID: "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.449981 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" (UID: "8cb0cadc-8ce2-4abd-8a60-461019fb6f6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.523082 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-kube-api-access-6sgqq\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.523124 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.523138 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.523150 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.979886 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" event={"ID":"8cb0cadc-8ce2-4abd-8a60-461019fb6f6d","Type":"ContainerDied","Data":"ac8b3ddd34b4b3e6520432cadeeb2ae38d6ac0b5f0c05c5e593da498cba0878a"} Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.979964 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac8b3ddd34b4b3e6520432cadeeb2ae38d6ac0b5f0c05c5e593da498cba0878a" Jan 31 09:46:35 crc kubenswrapper[4992]: I0131 09:46:35.980057 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6ftmf" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.118928 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 31 09:46:36 crc kubenswrapper[4992]: E0131 09:46:36.119376 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" containerName="nova-cell0-conductor-db-sync" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.119400 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" containerName="nova-cell0-conductor-db-sync" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.119621 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" containerName="nova-cell0-conductor-db-sync" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.120264 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.121982 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-m9scq" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.122914 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.133301 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.234488 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e726eda-44ef-49d4-9bc6-32efa2149de5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: 
I0131 09:46:36.234602 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9tv\" (UniqueName: \"kubernetes.io/projected/7e726eda-44ef-49d4-9bc6-32efa2149de5-kube-api-access-4z9tv\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.234676 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e726eda-44ef-49d4-9bc6-32efa2149de5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.336328 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e726eda-44ef-49d4-9bc6-32efa2149de5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.336509 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9tv\" (UniqueName: \"kubernetes.io/projected/7e726eda-44ef-49d4-9bc6-32efa2149de5-kube-api-access-4z9tv\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.336617 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e726eda-44ef-49d4-9bc6-32efa2149de5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.341061 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e726eda-44ef-49d4-9bc6-32efa2149de5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.342133 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e726eda-44ef-49d4-9bc6-32efa2149de5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.353980 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9tv\" (UniqueName: \"kubernetes.io/projected/7e726eda-44ef-49d4-9bc6-32efa2149de5-kube-api-access-4z9tv\") pod \"nova-cell0-conductor-0\" (UID: \"7e726eda-44ef-49d4-9bc6-32efa2149de5\") " pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.436700 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:36 crc kubenswrapper[4992]: I0131 09:46:36.849857 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 31 09:46:36 crc kubenswrapper[4992]: W0131 09:46:36.862932 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e726eda_44ef_49d4_9bc6_32efa2149de5.slice/crio-438e451876e364278ef7d49ceeab32af2f9a3e216a866ac4ea302d0ec76a9b2d WatchSource:0}: Error finding container 438e451876e364278ef7d49ceeab32af2f9a3e216a866ac4ea302d0ec76a9b2d: Status 404 returned error can't find the container with id 438e451876e364278ef7d49ceeab32af2f9a3e216a866ac4ea302d0ec76a9b2d Jan 31 09:46:37 crc kubenswrapper[4992]: I0131 09:46:37.004994 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7e726eda-44ef-49d4-9bc6-32efa2149de5","Type":"ContainerStarted","Data":"438e451876e364278ef7d49ceeab32af2f9a3e216a866ac4ea302d0ec76a9b2d"} Jan 31 09:46:38 crc kubenswrapper[4992]: I0131 09:46:38.014508 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7e726eda-44ef-49d4-9bc6-32efa2149de5","Type":"ContainerStarted","Data":"7f8a5c8c25aed778278da178bfd9984ab530d40badb5117d70561e345d3bee45"} Jan 31 09:46:38 crc kubenswrapper[4992]: I0131 09:46:38.014827 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:38 crc kubenswrapper[4992]: I0131 09:46:38.029935 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.029921229 podStartE2EDuration="2.029921229s" podCreationTimestamp="2026-01-31 09:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 
09:46:38.028722475 +0000 UTC m=+1294.000114472" watchObservedRunningTime="2026-01-31 09:46:38.029921229 +0000 UTC m=+1294.001313216" Jan 31 09:46:46 crc kubenswrapper[4992]: I0131 09:46:46.463548 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 31 09:46:46 crc kubenswrapper[4992]: I0131 09:46:46.927348 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dg67m"] Jan 31 09:46:46 crc kubenswrapper[4992]: I0131 09:46:46.928634 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:46 crc kubenswrapper[4992]: I0131 09:46:46.936312 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dg67m"] Jan 31 09:46:46 crc kubenswrapper[4992]: I0131 09:46:46.942936 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 31 09:46:46 crc kubenswrapper[4992]: I0131 09:46:46.943471 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.038748 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-config-data\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.038831 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtzz9\" (UniqueName: \"kubernetes.io/projected/1bd87507-7257-4df9-9690-0c9d8b9f7556-kube-api-access-dtzz9\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 
09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.038871 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-scripts\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.038944 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.130137 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.131343 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.138102 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.140252 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-config-data\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.140309 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtzz9\" (UniqueName: \"kubernetes.io/projected/1bd87507-7257-4df9-9690-0c9d8b9f7556-kube-api-access-dtzz9\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.140357 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-scripts\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.140418 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.154303 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: 
I0131 09:46:47.160003 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-config-data\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.162579 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.162676 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-scripts\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.166686 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtzz9\" (UniqueName: \"kubernetes.io/projected/1bd87507-7257-4df9-9690-0c9d8b9f7556-kube-api-access-dtzz9\") pod \"nova-cell0-cell-mapping-dg67m\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.212860 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.215009 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.219867 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.232510 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.244456 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.244589 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29p9c\" (UniqueName: \"kubernetes.io/projected/215f5e28-cf6d-4221-ba1d-25d894544fcd-kube-api-access-29p9c\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.244639 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.257127 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.258242 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.259487 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.271971 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.277412 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.336710 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.338999 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.341587 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347434 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjg4t\" (UniqueName: \"kubernetes.io/projected/9638b204-4079-42e5-9ee7-3cba97d67894-kube-api-access-qjg4t\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347587 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjwrx\" (UniqueName: \"kubernetes.io/projected/d22ab2ab-a644-45a9-8f93-4f968450b1d0-kube-api-access-kjwrx\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 
09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347612 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347702 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347855 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-config-data\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347897 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347918 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22ab2ab-a644-45a9-8f93-4f968450b1d0-logs\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.347954 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-config-data\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 
31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.348020 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29p9c\" (UniqueName: \"kubernetes.io/projected/215f5e28-cf6d-4221-ba1d-25d894544fcd-kube-api-access-29p9c\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.348095 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.348119 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.371251 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.372343 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.398620 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-29p9c\" (UniqueName: \"kubernetes.io/projected/215f5e28-cf6d-4221-ba1d-25d894544fcd-kube-api-access-29p9c\") pod \"nova-cell1-novncproxy-0\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.441232 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-td2ws"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.442846 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449227 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-config-data\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449312 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449340 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntbk\" (UniqueName: \"kubernetes.io/projected/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-kube-api-access-fntbk\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449368 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjg4t\" (UniqueName: 
\"kubernetes.io/projected/9638b204-4079-42e5-9ee7-3cba97d67894-kube-api-access-qjg4t\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449417 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjwrx\" (UniqueName: \"kubernetes.io/projected/d22ab2ab-a644-45a9-8f93-4f968450b1d0-kube-api-access-kjwrx\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449550 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-config-data\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449578 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-logs\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449638 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449672 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-config-data\") pod \"nova-scheduler-0\" (UID: 
\"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449696 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.449712 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22ab2ab-a644-45a9-8f93-4f968450b1d0-logs\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.450054 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-td2ws"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.450100 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22ab2ab-a644-45a9-8f93-4f968450b1d0-logs\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.486179 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-config-data\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.486224 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc 
kubenswrapper[4992]: I0131 09:46:47.491072 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjg4t\" (UniqueName: \"kubernetes.io/projected/9638b204-4079-42e5-9ee7-3cba97d67894-kube-api-access-qjg4t\") pod \"nova-scheduler-0\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.501672 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-config-data\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.504388 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjwrx\" (UniqueName: \"kubernetes.io/projected/d22ab2ab-a644-45a9-8f93-4f968450b1d0-kube-api-access-kjwrx\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.518989 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.539647 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.556974 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557047 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557221 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89xb2\" (UniqueName: \"kubernetes.io/projected/ad622716-6050-414e-b1c3-5a84c0881d16-kube-api-access-89xb2\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557288 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557310 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-config\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: 
\"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557347 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fntbk\" (UniqueName: \"kubernetes.io/projected/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-kube-api-access-fntbk\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557469 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557497 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-config-data\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557525 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-logs\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.557894 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-logs\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.562682 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.564078 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-config-data\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.569751 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.575044 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntbk\" (UniqueName: \"kubernetes.io/projected/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-kube-api-access-fntbk\") pod \"nova-metadata-0\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.596041 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.659169 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.659549 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89xb2\" (UniqueName: \"kubernetes.io/projected/ad622716-6050-414e-b1c3-5a84c0881d16-kube-api-access-89xb2\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.659587 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.659605 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-config\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.659668 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc 
kubenswrapper[4992]: I0131 09:46:47.660165 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.662154 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-config\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.664318 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.664831 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.691875 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89xb2\" (UniqueName: \"kubernetes.io/projected/ad622716-6050-414e-b1c3-5a84c0881d16-kube-api-access-89xb2\") pod \"dnsmasq-dns-8b8cf6657-td2ws\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.791055 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.802752 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.888829 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dg67m"] Jan 31 09:46:47 crc kubenswrapper[4992]: W0131 09:46:47.902264 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bd87507_7257_4df9_9690_0c9d8b9f7556.slice/crio-92375c41a85b17af34e760bcc6eb5a9ef59fa7e1b510d77fb598cc409fcb55a5 WatchSource:0}: Error finding container 92375c41a85b17af34e760bcc6eb5a9ef59fa7e1b510d77fb598cc409fcb55a5: Status 404 returned error can't find the container with id 92375c41a85b17af34e760bcc6eb5a9ef59fa7e1b510d77fb598cc409fcb55a5 Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.944720 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hdjxg"] Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.945887 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.948366 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.948435 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 31 09:46:47 crc kubenswrapper[4992]: I0131 09:46:47.953379 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hdjxg"] Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.037707 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.060313 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.072642 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-scripts\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.072724 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-config-data\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.072790 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.072869 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nsfq\" (UniqueName: \"kubernetes.io/projected/d31a3cea-f382-4709-b605-b6474a9c722c-kube-api-access-6nsfq\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.111910 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215f5e28-cf6d-4221-ba1d-25d894544fcd","Type":"ContainerStarted","Data":"fec23e07e4ab9042c92fdac36824ab94383b1ac7cd7e8abae60e9ccc636f60df"} Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.113391 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dg67m" event={"ID":"1bd87507-7257-4df9-9690-0c9d8b9f7556","Type":"ContainerStarted","Data":"92375c41a85b17af34e760bcc6eb5a9ef59fa7e1b510d77fb598cc409fcb55a5"} Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.116235 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d22ab2ab-a644-45a9-8f93-4f968450b1d0","Type":"ContainerStarted","Data":"706e12aaf26feb6701868a085254ed166990c1b66b1e0424943255f54ea9cb09"} Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.144059 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dg67m" podStartSLOduration=2.144041664 podStartE2EDuration="2.144041664s" podCreationTimestamp="2026-01-31 09:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:46:48.130087341 +0000 UTC m=+1304.101479348" 
watchObservedRunningTime="2026-01-31 09:46:48.144041664 +0000 UTC m=+1304.115433651" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.174489 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-scripts\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.174544 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-config-data\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.174585 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.174643 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nsfq\" (UniqueName: \"kubernetes.io/projected/d31a3cea-f382-4709-b605-b6474a9c722c-kube-api-access-6nsfq\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.181905 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: 
\"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.182023 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-config-data\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.185553 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-scripts\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.194453 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nsfq\" (UniqueName: \"kubernetes.io/projected/d31a3cea-f382-4709-b605-b6474a9c722c-kube-api-access-6nsfq\") pod \"nova-cell1-conductor-db-sync-hdjxg\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.223441 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.275113 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:48 crc kubenswrapper[4992]: W0131 09:46:48.347745 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad622716_6050_414e_b1c3_5a84c0881d16.slice/crio-1f28a2416838130bf6466d0e33ce0fde48c6ee329b6c92bb5251cf451cccfe82 WatchSource:0}: Error finding container 1f28a2416838130bf6466d0e33ce0fde48c6ee329b6c92bb5251cf451cccfe82: Status 404 returned error can't find the container with id 1f28a2416838130bf6466d0e33ce0fde48c6ee329b6c92bb5251cf451cccfe82 Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.352611 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-td2ws"] Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.426475 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:48 crc kubenswrapper[4992]: I0131 09:46:48.764530 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hdjxg"] Jan 31 09:46:48 crc kubenswrapper[4992]: W0131 09:46:48.773382 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd31a3cea_f382_4709_b605_b6474a9c722c.slice/crio-956d85363fdf2d5d6e0cbd6d6145efd29b90b7bed65494a3062740f038045a64 WatchSource:0}: Error finding container 956d85363fdf2d5d6e0cbd6d6145efd29b90b7bed65494a3062740f038045a64: Status 404 returned error can't find the container with id 956d85363fdf2d5d6e0cbd6d6145efd29b90b7bed65494a3062740f038045a64 Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.128047 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" event={"ID":"d31a3cea-f382-4709-b605-b6474a9c722c","Type":"ContainerStarted","Data":"7b66c990fb38769ee48c362c0adb66670154b9e1bf0caa1e4f4668b01b780c37"} Jan 31 09:46:49 crc 
kubenswrapper[4992]: I0131 09:46:49.128292 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" event={"ID":"d31a3cea-f382-4709-b605-b6474a9c722c","Type":"ContainerStarted","Data":"956d85363fdf2d5d6e0cbd6d6145efd29b90b7bed65494a3062740f038045a64"} Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.132241 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6ef2449-6751-470b-9c8e-51bd7e4afb1b","Type":"ContainerStarted","Data":"e8ae7dd973efe9ed9c99d7e8bf43e8ae9f9c377e0f4b641cee5f015fbffaea78"} Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.133771 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dg67m" event={"ID":"1bd87507-7257-4df9-9690-0c9d8b9f7556","Type":"ContainerStarted","Data":"3dd7393a58500f1a270c541f212d0a7a9031ca20e9c9cb77f7528b5986633832"} Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.140025 4992 generic.go:334] "Generic (PLEG): container finished" podID="ad622716-6050-414e-b1c3-5a84c0881d16" containerID="8051ce022b6ddcf48880aa23758db55c937862f3d55aed824388d8195cf800f6" exitCode=0 Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.140107 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" event={"ID":"ad622716-6050-414e-b1c3-5a84c0881d16","Type":"ContainerDied","Data":"8051ce022b6ddcf48880aa23758db55c937862f3d55aed824388d8195cf800f6"} Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.140140 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" event={"ID":"ad622716-6050-414e-b1c3-5a84c0881d16","Type":"ContainerStarted","Data":"1f28a2416838130bf6466d0e33ce0fde48c6ee329b6c92bb5251cf451cccfe82"} Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.147385 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"9638b204-4079-42e5-9ee7-3cba97d67894","Type":"ContainerStarted","Data":"e5a817c88af66c2c79a0c6cbd7ebaa62005dbd5d7b1f51e26b4ea160117a6fe1"} Jan 31 09:46:49 crc kubenswrapper[4992]: I0131 09:46:49.159115 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" podStartSLOduration=2.159099476 podStartE2EDuration="2.159099476s" podCreationTimestamp="2026-01-31 09:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:46:49.145835273 +0000 UTC m=+1305.117227280" watchObservedRunningTime="2026-01-31 09:46:49.159099476 +0000 UTC m=+1305.130491463" Jan 31 09:46:50 crc kubenswrapper[4992]: I0131 09:46:50.936173 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:50 crc kubenswrapper[4992]: I0131 09:46:50.960269 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.177188 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6ef2449-6751-470b-9c8e-51bd7e4afb1b","Type":"ContainerStarted","Data":"3f65401227e1f702ac900a90869255bd326ea7a63339fa81298e5e849ebe921e"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.177685 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6ef2449-6751-470b-9c8e-51bd7e4afb1b","Type":"ContainerStarted","Data":"e4546e7dda719482e9b9788bea385875656053a0bc3ba673f0f8a756a4f61b5c"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.177535 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-metadata" containerID="cri-o://3f65401227e1f702ac900a90869255bd326ea7a63339fa81298e5e849ebe921e" gracePeriod=30 
Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.177319 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-log" containerID="cri-o://e4546e7dda719482e9b9788bea385875656053a0bc3ba673f0f8a756a4f61b5c" gracePeriod=30 Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.179823 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="215f5e28-cf6d-4221-ba1d-25d894544fcd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053" gracePeriod=30 Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.179912 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215f5e28-cf6d-4221-ba1d-25d894544fcd","Type":"ContainerStarted","Data":"09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.190322 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" event={"ID":"ad622716-6050-414e-b1c3-5a84c0881d16","Type":"ContainerStarted","Data":"d8073dcb0ad1a519a5f0fcc8b3b2d7af060581856e49c38ea24df8525e40ad5e"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.190449 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.191979 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9638b204-4079-42e5-9ee7-3cba97d67894","Type":"ContainerStarted","Data":"5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.193519 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d22ab2ab-a644-45a9-8f93-4f968450b1d0","Type":"ContainerStarted","Data":"e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.193546 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d22ab2ab-a644-45a9-8f93-4f968450b1d0","Type":"ContainerStarted","Data":"dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92"} Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.205684 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.357915418 podStartE2EDuration="5.205666552s" podCreationTimestamp="2026-01-31 09:46:47 +0000 UTC" firstStartedPulling="2026-01-31 09:46:48.460700997 +0000 UTC m=+1304.432092984" lastFinishedPulling="2026-01-31 09:46:51.308452111 +0000 UTC m=+1307.279844118" observedRunningTime="2026-01-31 09:46:52.205073805 +0000 UTC m=+1308.176465822" watchObservedRunningTime="2026-01-31 09:46:52.205666552 +0000 UTC m=+1308.177058539" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.224275 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" podStartSLOduration=5.224260746 podStartE2EDuration="5.224260746s" podCreationTimestamp="2026-01-31 09:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:46:52.219932014 +0000 UTC m=+1308.191324011" watchObservedRunningTime="2026-01-31 09:46:52.224260746 +0000 UTC m=+1308.195652733" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.241100 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.164215479 podStartE2EDuration="5.2410813s" podCreationTimestamp="2026-01-31 09:46:47 +0000 UTC" firstStartedPulling="2026-01-31 09:46:48.229621045 +0000 UTC m=+1304.201013032" 
lastFinishedPulling="2026-01-31 09:46:51.306486866 +0000 UTC m=+1307.277878853" observedRunningTime="2026-01-31 09:46:52.23611683 +0000 UTC m=+1308.207508817" watchObservedRunningTime="2026-01-31 09:46:52.2410813 +0000 UTC m=+1308.212473287" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.258879 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.006740302 podStartE2EDuration="5.258863631s" podCreationTimestamp="2026-01-31 09:46:47 +0000 UTC" firstStartedPulling="2026-01-31 09:46:48.063807953 +0000 UTC m=+1304.035199940" lastFinishedPulling="2026-01-31 09:46:51.315931262 +0000 UTC m=+1307.287323269" observedRunningTime="2026-01-31 09:46:52.250790973 +0000 UTC m=+1308.222182970" watchObservedRunningTime="2026-01-31 09:46:52.258863631 +0000 UTC m=+1308.230255608" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.264439 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.015259093 podStartE2EDuration="5.264427198s" podCreationTimestamp="2026-01-31 09:46:47 +0000 UTC" firstStartedPulling="2026-01-31 09:46:48.058196665 +0000 UTC m=+1304.029588652" lastFinishedPulling="2026-01-31 09:46:51.30736477 +0000 UTC m=+1307.278756757" observedRunningTime="2026-01-31 09:46:52.263709957 +0000 UTC m=+1308.235101974" watchObservedRunningTime="2026-01-31 09:46:52.264427198 +0000 UTC m=+1308.235819185" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.540749 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.596525 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 31 09:46:52 crc kubenswrapper[4992]: I0131 09:46:52.803124 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:46:52 crc 
kubenswrapper[4992]: I0131 09:46:52.803181 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.204547 4992 generic.go:334] "Generic (PLEG): container finished" podID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerID="3f65401227e1f702ac900a90869255bd326ea7a63339fa81298e5e849ebe921e" exitCode=0 Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.204872 4992 generic.go:334] "Generic (PLEG): container finished" podID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerID="e4546e7dda719482e9b9788bea385875656053a0bc3ba673f0f8a756a4f61b5c" exitCode=143 Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.204594 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6ef2449-6751-470b-9c8e-51bd7e4afb1b","Type":"ContainerDied","Data":"3f65401227e1f702ac900a90869255bd326ea7a63339fa81298e5e849ebe921e"} Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.205500 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6ef2449-6751-470b-9c8e-51bd7e4afb1b","Type":"ContainerDied","Data":"e4546e7dda719482e9b9788bea385875656053a0bc3ba673f0f8a756a4f61b5c"} Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.298590 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.377612 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fntbk\" (UniqueName: \"kubernetes.io/projected/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-kube-api-access-fntbk\") pod \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.377690 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-logs\") pod \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.377725 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-combined-ca-bundle\") pod \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.377874 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-config-data\") pod \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\" (UID: \"e6ef2449-6751-470b-9c8e-51bd7e4afb1b\") " Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.378048 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-logs" (OuterVolumeSpecName: "logs") pod "e6ef2449-6751-470b-9c8e-51bd7e4afb1b" (UID: "e6ef2449-6751-470b-9c8e-51bd7e4afb1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.379148 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.395730 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-kube-api-access-fntbk" (OuterVolumeSpecName: "kube-api-access-fntbk") pod "e6ef2449-6751-470b-9c8e-51bd7e4afb1b" (UID: "e6ef2449-6751-470b-9c8e-51bd7e4afb1b"). InnerVolumeSpecName "kube-api-access-fntbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.403094 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-config-data" (OuterVolumeSpecName: "config-data") pod "e6ef2449-6751-470b-9c8e-51bd7e4afb1b" (UID: "e6ef2449-6751-470b-9c8e-51bd7e4afb1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.404360 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6ef2449-6751-470b-9c8e-51bd7e4afb1b" (UID: "e6ef2449-6751-470b-9c8e-51bd7e4afb1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.481552 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fntbk\" (UniqueName: \"kubernetes.io/projected/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-kube-api-access-fntbk\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.481588 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:53 crc kubenswrapper[4992]: I0131 09:46:53.481599 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ef2449-6751-470b-9c8e-51bd7e4afb1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.214348 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.217592 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e6ef2449-6751-470b-9c8e-51bd7e4afb1b","Type":"ContainerDied","Data":"e8ae7dd973efe9ed9c99d7e8bf43e8ae9f9c377e0f4b641cee5f015fbffaea78"} Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.219485 4992 scope.go:117] "RemoveContainer" containerID="3f65401227e1f702ac900a90869255bd326ea7a63339fa81298e5e849ebe921e" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.247996 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.248779 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.267508 4992 scope.go:117] "RemoveContainer" 
containerID="e4546e7dda719482e9b9788bea385875656053a0bc3ba673f0f8a756a4f61b5c" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.261696 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.287290 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:54 crc kubenswrapper[4992]: E0131 09:46:54.287745 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-log" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.287767 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-log" Jan 31 09:46:54 crc kubenswrapper[4992]: E0131 09:46:54.287787 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-metadata" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.287795 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-metadata" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.288026 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-log" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.288055 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" containerName="nova-metadata-metadata" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.290642 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.294328 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.294328 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.295266 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.295374 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870499f5-2755-4603-929e-cfb5e79e7dea-logs\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.295699 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-config-data\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.295834 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.295877 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c944l\" (UniqueName: \"kubernetes.io/projected/870499f5-2755-4603-929e-cfb5e79e7dea-kube-api-access-c944l\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.315882 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.398344 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.398406 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870499f5-2755-4603-929e-cfb5e79e7dea-logs\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.398449 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-config-data\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.398481 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.398499 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c944l\" (UniqueName: \"kubernetes.io/projected/870499f5-2755-4603-929e-cfb5e79e7dea-kube-api-access-c944l\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.402458 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870499f5-2755-4603-929e-cfb5e79e7dea-logs\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.406049 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.406709 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-config-data\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.412819 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.415152 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c944l\" (UniqueName: \"kubernetes.io/projected/870499f5-2755-4603-929e-cfb5e79e7dea-kube-api-access-c944l\") pod 
\"nova-metadata-0\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " pod="openstack/nova-metadata-0" Jan 31 09:46:54 crc kubenswrapper[4992]: I0131 09:46:54.621602 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:55 crc kubenswrapper[4992]: I0131 09:46:55.106067 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:55 crc kubenswrapper[4992]: I0131 09:46:55.193242 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ef2449-6751-470b-9c8e-51bd7e4afb1b" path="/var/lib/kubelet/pods/e6ef2449-6751-470b-9c8e-51bd7e4afb1b/volumes" Jan 31 09:46:55 crc kubenswrapper[4992]: I0131 09:46:55.225298 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870499f5-2755-4603-929e-cfb5e79e7dea","Type":"ContainerStarted","Data":"0cbf7421c52b54cfab9b85a5199f7734d11ca515a6a1c6e53c9821b8caa58b14"} Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.236128 4992 generic.go:334] "Generic (PLEG): container finished" podID="d31a3cea-f382-4709-b605-b6474a9c722c" containerID="7b66c990fb38769ee48c362c0adb66670154b9e1bf0caa1e4f4668b01b780c37" exitCode=0 Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.236542 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" event={"ID":"d31a3cea-f382-4709-b605-b6474a9c722c","Type":"ContainerDied","Data":"7b66c990fb38769ee48c362c0adb66670154b9e1bf0caa1e4f4668b01b780c37"} Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.246822 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870499f5-2755-4603-929e-cfb5e79e7dea","Type":"ContainerStarted","Data":"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61"} Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.246866 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"870499f5-2755-4603-929e-cfb5e79e7dea","Type":"ContainerStarted","Data":"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390"} Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.260566 4992 generic.go:334] "Generic (PLEG): container finished" podID="1bd87507-7257-4df9-9690-0c9d8b9f7556" containerID="3dd7393a58500f1a270c541f212d0a7a9031ca20e9c9cb77f7528b5986633832" exitCode=0 Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.260623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dg67m" event={"ID":"1bd87507-7257-4df9-9690-0c9d8b9f7556","Type":"ContainerDied","Data":"3dd7393a58500f1a270c541f212d0a7a9031ca20e9c9cb77f7528b5986633832"} Jan 31 09:46:56 crc kubenswrapper[4992]: I0131 09:46:56.282790 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.282772036 podStartE2EDuration="2.282772036s" podCreationTimestamp="2026-01-31 09:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:46:56.273764752 +0000 UTC m=+1312.245156739" watchObservedRunningTime="2026-01-31 09:46:56.282772036 +0000 UTC m=+1312.254164023" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.571511 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.571832 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.596909 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.645893 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 31 09:46:57 
crc kubenswrapper[4992]: I0131 09:46:57.773822 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.779662 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.794583 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.862561 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-psm5z"] Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.864767 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerName="dnsmasq-dns" containerID="cri-o://8259faf3193d7dcf85bd42def3fe64eeb0fb5a1232744a428e679080fea37690" gracePeriod=10 Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.964880 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-config-data\") pod \"1bd87507-7257-4df9-9690-0c9d8b9f7556\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.964947 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtzz9\" (UniqueName: \"kubernetes.io/projected/1bd87507-7257-4df9-9690-0c9d8b9f7556-kube-api-access-dtzz9\") pod \"1bd87507-7257-4df9-9690-0c9d8b9f7556\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.964980 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-combined-ca-bundle\") pod \"1bd87507-7257-4df9-9690-0c9d8b9f7556\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.965013 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-config-data\") pod \"d31a3cea-f382-4709-b605-b6474a9c722c\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.965109 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-scripts\") pod \"1bd87507-7257-4df9-9690-0c9d8b9f7556\" (UID: \"1bd87507-7257-4df9-9690-0c9d8b9f7556\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.965226 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nsfq\" (UniqueName: \"kubernetes.io/projected/d31a3cea-f382-4709-b605-b6474a9c722c-kube-api-access-6nsfq\") pod \"d31a3cea-f382-4709-b605-b6474a9c722c\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.965246 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-scripts\") pod \"d31a3cea-f382-4709-b605-b6474a9c722c\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.965268 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-combined-ca-bundle\") pod \"d31a3cea-f382-4709-b605-b6474a9c722c\" (UID: \"d31a3cea-f382-4709-b605-b6474a9c722c\") " Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.971780 4992 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd87507-7257-4df9-9690-0c9d8b9f7556-kube-api-access-dtzz9" (OuterVolumeSpecName: "kube-api-access-dtzz9") pod "1bd87507-7257-4df9-9690-0c9d8b9f7556" (UID: "1bd87507-7257-4df9-9690-0c9d8b9f7556"). InnerVolumeSpecName "kube-api-access-dtzz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.975610 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-scripts" (OuterVolumeSpecName: "scripts") pod "1bd87507-7257-4df9-9690-0c9d8b9f7556" (UID: "1bd87507-7257-4df9-9690-0c9d8b9f7556"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.976194 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-scripts" (OuterVolumeSpecName: "scripts") pod "d31a3cea-f382-4709-b605-b6474a9c722c" (UID: "d31a3cea-f382-4709-b605-b6474a9c722c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:57 crc kubenswrapper[4992]: I0131 09:46:57.982658 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31a3cea-f382-4709-b605-b6474a9c722c-kube-api-access-6nsfq" (OuterVolumeSpecName: "kube-api-access-6nsfq") pod "d31a3cea-f382-4709-b605-b6474a9c722c" (UID: "d31a3cea-f382-4709-b605-b6474a9c722c"). InnerVolumeSpecName "kube-api-access-6nsfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.009733 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-config-data" (OuterVolumeSpecName: "config-data") pod "d31a3cea-f382-4709-b605-b6474a9c722c" (UID: "d31a3cea-f382-4709-b605-b6474a9c722c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.011158 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bd87507-7257-4df9-9690-0c9d8b9f7556" (UID: "1bd87507-7257-4df9-9690-0c9d8b9f7556"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.028053 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-config-data" (OuterVolumeSpecName: "config-data") pod "1bd87507-7257-4df9-9690-0c9d8b9f7556" (UID: "1bd87507-7257-4df9-9690-0c9d8b9f7556"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.028959 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d31a3cea-f382-4709-b605-b6474a9c722c" (UID: "d31a3cea-f382-4709-b605-b6474a9c722c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069518 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069566 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nsfq\" (UniqueName: \"kubernetes.io/projected/d31a3cea-f382-4709-b605-b6474a9c722c-kube-api-access-6nsfq\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069581 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069593 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069604 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069615 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtzz9\" (UniqueName: \"kubernetes.io/projected/1bd87507-7257-4df9-9690-0c9d8b9f7556-kube-api-access-dtzz9\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069626 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd87507-7257-4df9-9690-0c9d8b9f7556-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.069636 4992 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d31a3cea-f382-4709-b605-b6474a9c722c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.278312 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" event={"ID":"d31a3cea-f382-4709-b605-b6474a9c722c","Type":"ContainerDied","Data":"956d85363fdf2d5d6e0cbd6d6145efd29b90b7bed65494a3062740f038045a64"} Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.278363 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956d85363fdf2d5d6e0cbd6d6145efd29b90b7bed65494a3062740f038045a64" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.278441 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hdjxg" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.296435 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dg67m" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.296440 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dg67m" event={"ID":"1bd87507-7257-4df9-9690-0c9d8b9f7556","Type":"ContainerDied","Data":"92375c41a85b17af34e760bcc6eb5a9ef59fa7e1b510d77fb598cc409fcb55a5"} Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.296473 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92375c41a85b17af34e760bcc6eb5a9ef59fa7e1b510d77fb598cc409fcb55a5" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.340384 4992 generic.go:334] "Generic (PLEG): container finished" podID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerID="8259faf3193d7dcf85bd42def3fe64eeb0fb5a1232744a428e679080fea37690" exitCode=0 Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.341546 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" event={"ID":"5ae0611b-2e81-4f74-9acd-b52769aadd42","Type":"ContainerDied","Data":"8259faf3193d7dcf85bd42def3fe64eeb0fb5a1232744a428e679080fea37690"} Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.364159 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 31 09:46:58 crc kubenswrapper[4992]: E0131 09:46:58.364645 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd87507-7257-4df9-9690-0c9d8b9f7556" containerName="nova-manage" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.364664 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd87507-7257-4df9-9690-0c9d8b9f7556" containerName="nova-manage" Jan 31 09:46:58 crc kubenswrapper[4992]: E0131 09:46:58.364675 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31a3cea-f382-4709-b605-b6474a9c722c" containerName="nova-cell1-conductor-db-sync" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.364682 4992 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d31a3cea-f382-4709-b605-b6474a9c722c" containerName="nova-cell1-conductor-db-sync" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.364897 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd87507-7257-4df9-9690-0c9d8b9f7556" containerName="nova-manage" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.364910 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31a3cea-f382-4709-b605-b6474a9c722c" containerName="nova-cell1-conductor-db-sync" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.365503 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.368367 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.375776 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.394231 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.415303 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.491659 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743e90b7-a043-4412-b05a-d9d36b5e9cf8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.491809 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pdt\" (UniqueName: \"kubernetes.io/projected/743e90b7-a043-4412-b05a-d9d36b5e9cf8-kube-api-access-d7pdt\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.491866 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743e90b7-a043-4412-b05a-d9d36b5e9cf8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.538721 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.538977 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-log" containerID="cri-o://dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92" gracePeriod=30 Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.539058 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-api" 
containerID="cri-o://e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258" gracePeriod=30 Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.544090 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": EOF" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.544531 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": EOF" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.561687 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.561902 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-log" containerID="cri-o://4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390" gracePeriod=30 Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.562023 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-metadata" containerID="cri-o://aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61" gracePeriod=30 Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.593191 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-sb\") pod \"5ae0611b-2e81-4f74-9acd-b52769aadd42\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.593704 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-config\") pod \"5ae0611b-2e81-4f74-9acd-b52769aadd42\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.593782 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-dns-svc\") pod \"5ae0611b-2e81-4f74-9acd-b52769aadd42\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.593817 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrgqs\" (UniqueName: \"kubernetes.io/projected/5ae0611b-2e81-4f74-9acd-b52769aadd42-kube-api-access-lrgqs\") pod \"5ae0611b-2e81-4f74-9acd-b52769aadd42\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.593864 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-nb\") pod \"5ae0611b-2e81-4f74-9acd-b52769aadd42\" (UID: \"5ae0611b-2e81-4f74-9acd-b52769aadd42\") " Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.594148 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pdt\" (UniqueName: \"kubernetes.io/projected/743e90b7-a043-4412-b05a-d9d36b5e9cf8-kube-api-access-d7pdt\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.594233 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743e90b7-a043-4412-b05a-d9d36b5e9cf8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.594292 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743e90b7-a043-4412-b05a-d9d36b5e9cf8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.601320 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae0611b-2e81-4f74-9acd-b52769aadd42-kube-api-access-lrgqs" (OuterVolumeSpecName: "kube-api-access-lrgqs") pod "5ae0611b-2e81-4f74-9acd-b52769aadd42" (UID: "5ae0611b-2e81-4f74-9acd-b52769aadd42"). InnerVolumeSpecName "kube-api-access-lrgqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.608530 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743e90b7-a043-4412-b05a-d9d36b5e9cf8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.613909 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743e90b7-a043-4412-b05a-d9d36b5e9cf8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.624627 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7pdt\" (UniqueName: \"kubernetes.io/projected/743e90b7-a043-4412-b05a-d9d36b5e9cf8-kube-api-access-d7pdt\") pod \"nova-cell1-conductor-0\" (UID: \"743e90b7-a043-4412-b05a-d9d36b5e9cf8\") " 
pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.652390 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ae0611b-2e81-4f74-9acd-b52769aadd42" (UID: "5ae0611b-2e81-4f74-9acd-b52769aadd42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.660063 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ae0611b-2e81-4f74-9acd-b52769aadd42" (UID: "5ae0611b-2e81-4f74-9acd-b52769aadd42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.683917 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-config" (OuterVolumeSpecName: "config") pod "5ae0611b-2e81-4f74-9acd-b52769aadd42" (UID: "5ae0611b-2e81-4f74-9acd-b52769aadd42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.694253 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.695370 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ae0611b-2e81-4f74-9acd-b52769aadd42" (UID: "5ae0611b-2e81-4f74-9acd-b52769aadd42"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.695797 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.695817 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.695829 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrgqs\" (UniqueName: \"kubernetes.io/projected/5ae0611b-2e81-4f74-9acd-b52769aadd42-kube-api-access-lrgqs\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.695840 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:58 crc kubenswrapper[4992]: I0131 09:46:58.695851 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ae0611b-2e81-4f74-9acd-b52769aadd42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.063310 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.170218 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 31 09:46:59 crc kubenswrapper[4992]: W0131 09:46:59.171166 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod743e90b7_a043_4412_b05a_d9d36b5e9cf8.slice/crio-cacd95864baf11737f879454f599a34a9701bfcb74aaf21196303bd6fb707ace WatchSource:0}: Error finding container cacd95864baf11737f879454f599a34a9701bfcb74aaf21196303bd6fb707ace: Status 404 returned error can't find the container with id cacd95864baf11737f879454f599a34a9701bfcb74aaf21196303bd6fb707ace Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.201638 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.218715 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-nova-metadata-tls-certs\") pod \"870499f5-2755-4603-929e-cfb5e79e7dea\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.218790 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-combined-ca-bundle\") pod \"870499f5-2755-4603-929e-cfb5e79e7dea\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.256037 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "870499f5-2755-4603-929e-cfb5e79e7dea" (UID: "870499f5-2755-4603-929e-cfb5e79e7dea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.277089 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "870499f5-2755-4603-929e-cfb5e79e7dea" (UID: "870499f5-2755-4603-929e-cfb5e79e7dea"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.329093 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-config-data\") pod \"870499f5-2755-4603-929e-cfb5e79e7dea\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.329166 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c944l\" (UniqueName: \"kubernetes.io/projected/870499f5-2755-4603-929e-cfb5e79e7dea-kube-api-access-c944l\") pod \"870499f5-2755-4603-929e-cfb5e79e7dea\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.329209 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870499f5-2755-4603-929e-cfb5e79e7dea-logs\") pod \"870499f5-2755-4603-929e-cfb5e79e7dea\" (UID: \"870499f5-2755-4603-929e-cfb5e79e7dea\") " Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.329747 4992 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.329761 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.329971 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/870499f5-2755-4603-929e-cfb5e79e7dea-logs" (OuterVolumeSpecName: "logs") pod "870499f5-2755-4603-929e-cfb5e79e7dea" (UID: "870499f5-2755-4603-929e-cfb5e79e7dea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.332716 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870499f5-2755-4603-929e-cfb5e79e7dea-kube-api-access-c944l" (OuterVolumeSpecName: "kube-api-access-c944l") pod "870499f5-2755-4603-929e-cfb5e79e7dea" (UID: "870499f5-2755-4603-929e-cfb5e79e7dea"). InnerVolumeSpecName "kube-api-access-c944l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.352484 4992 generic.go:334] "Generic (PLEG): container finished" podID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerID="dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92" exitCode=143 Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.352548 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d22ab2ab-a644-45a9-8f93-4f968450b1d0","Type":"ContainerDied","Data":"dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92"} Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.356975 4992 generic.go:334] "Generic (PLEG): container finished" podID="870499f5-2755-4603-929e-cfb5e79e7dea" containerID="aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61" exitCode=0 Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.357006 4992 generic.go:334] "Generic (PLEG): container finished" podID="870499f5-2755-4603-929e-cfb5e79e7dea" 
containerID="4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390" exitCode=143 Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.357047 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870499f5-2755-4603-929e-cfb5e79e7dea","Type":"ContainerDied","Data":"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61"} Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.357072 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870499f5-2755-4603-929e-cfb5e79e7dea","Type":"ContainerDied","Data":"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390"} Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.357082 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"870499f5-2755-4603-929e-cfb5e79e7dea","Type":"ContainerDied","Data":"0cbf7421c52b54cfab9b85a5199f7734d11ca515a6a1c6e53c9821b8caa58b14"} Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.357096 4992 scope.go:117] "RemoveContainer" containerID="aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.357228 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.360547 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-config-data" (OuterVolumeSpecName: "config-data") pod "870499f5-2755-4603-929e-cfb5e79e7dea" (UID: "870499f5-2755-4603-929e-cfb5e79e7dea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.360612 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"743e90b7-a043-4412-b05a-d9d36b5e9cf8","Type":"ContainerStarted","Data":"cacd95864baf11737f879454f599a34a9701bfcb74aaf21196303bd6fb707ace"} Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.360871 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.363617 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.363771 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-psm5z" event={"ID":"5ae0611b-2e81-4f74-9acd-b52769aadd42","Type":"ContainerDied","Data":"d728275de79f33bf97691fe37a91d54baba3bc1623cc0f5f13bab36c7b1eee0e"} Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.377817 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.377798037 podStartE2EDuration="1.377798037s" podCreationTimestamp="2026-01-31 09:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:46:59.374987188 +0000 UTC m=+1315.346379195" watchObservedRunningTime="2026-01-31 09:46:59.377798037 +0000 UTC m=+1315.349190024" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.382759 4992 scope.go:117] "RemoveContainer" containerID="4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.397903 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-psm5z"] Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 
09:46:59.405688 4992 scope.go:117] "RemoveContainer" containerID="aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61" Jan 31 09:46:59 crc kubenswrapper[4992]: E0131 09:46:59.406160 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61\": container with ID starting with aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61 not found: ID does not exist" containerID="aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.406192 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61"} err="failed to get container status \"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61\": rpc error: code = NotFound desc = could not find container \"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61\": container with ID starting with aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61 not found: ID does not exist" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.406213 4992 scope.go:117] "RemoveContainer" containerID="4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390" Jan 31 09:46:59 crc kubenswrapper[4992]: E0131 09:46:59.406576 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390\": container with ID starting with 4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390 not found: ID does not exist" containerID="4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.406595 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390"} err="failed to get container status \"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390\": rpc error: code = NotFound desc = could not find container \"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390\": container with ID starting with 4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390 not found: ID does not exist" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.406607 4992 scope.go:117] "RemoveContainer" containerID="aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.406821 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61"} err="failed to get container status \"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61\": rpc error: code = NotFound desc = could not find container \"aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61\": container with ID starting with aaac47832bbcc212580114156f8bdb3fea04cf25a5a8f493ea02dd791a7b5b61 not found: ID does not exist" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.406908 4992 scope.go:117] "RemoveContainer" containerID="4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.407760 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390"} err="failed to get container status \"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390\": rpc error: code = NotFound desc = could not find container \"4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390\": container with ID starting with 4ed1e3f8dad6a2a9f3de0a9cb185fec273acaeb8a8da19721248e59cc8536390 not found: ID does not 
exist" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.407790 4992 scope.go:117] "RemoveContainer" containerID="8259faf3193d7dcf85bd42def3fe64eeb0fb5a1232744a428e679080fea37690" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.413970 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-psm5z"] Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.425465 4992 scope.go:117] "RemoveContainer" containerID="6d938c484e6def42119b72f5b2bc79c6223d33f1c2b5107158025f9ea22a007e" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.431081 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/870499f5-2755-4603-929e-cfb5e79e7dea-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.431107 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c944l\" (UniqueName: \"kubernetes.io/projected/870499f5-2755-4603-929e-cfb5e79e7dea-kube-api-access-c944l\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.431116 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/870499f5-2755-4603-929e-cfb5e79e7dea-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.693302 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.701575 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.714501 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:59 crc kubenswrapper[4992]: E0131 09:46:59.715265 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerName="init" Jan 31 09:46:59 crc 
kubenswrapper[4992]: I0131 09:46:59.715293 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerName="init" Jan 31 09:46:59 crc kubenswrapper[4992]: E0131 09:46:59.715333 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-log" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.715342 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-log" Jan 31 09:46:59 crc kubenswrapper[4992]: E0131 09:46:59.715352 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-metadata" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.715363 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-metadata" Jan 31 09:46:59 crc kubenswrapper[4992]: E0131 09:46:59.715374 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerName="dnsmasq-dns" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.715379 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerName="dnsmasq-dns" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.715607 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" containerName="dnsmasq-dns" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.715621 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-log" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.715638 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" containerName="nova-metadata-metadata" Jan 31 09:46:59 crc 
kubenswrapper[4992]: I0131 09:46:59.716537 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.724651 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.725385 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.734276 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.736696 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.736756 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-config-data\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.736817 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.736923 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4wj8\" 
(UniqueName: \"kubernetes.io/projected/d91af037-5b1b-4543-810e-06667d38a865-kube-api-access-f4wj8\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.736951 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91af037-5b1b-4543-810e-06667d38a865-logs\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.838597 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4wj8\" (UniqueName: \"kubernetes.io/projected/d91af037-5b1b-4543-810e-06667d38a865-kube-api-access-f4wj8\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.838683 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91af037-5b1b-4543-810e-06667d38a865-logs\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.838744 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.838792 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-config-data\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " 
pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.838827 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.839431 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91af037-5b1b-4543-810e-06667d38a865-logs\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.845971 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.846618 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-config-data\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.847316 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:46:59 crc kubenswrapper[4992]: I0131 09:46:59.856987 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4wj8\" (UniqueName: 
\"kubernetes.io/projected/d91af037-5b1b-4543-810e-06667d38a865-kube-api-access-f4wj8\") pod \"nova-metadata-0\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " pod="openstack/nova-metadata-0" Jan 31 09:47:00 crc kubenswrapper[4992]: I0131 09:47:00.032834 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:47:00 crc kubenswrapper[4992]: I0131 09:47:00.374620 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"743e90b7-a043-4412-b05a-d9d36b5e9cf8","Type":"ContainerStarted","Data":"c2efd44c83a6bc71b4eda16672de4bb69748f44ade10c4034916e0ff6f8ecf6b"} Jan 31 09:47:00 crc kubenswrapper[4992]: I0131 09:47:00.376063 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9638b204-4079-42e5-9ee7-3cba97d67894" containerName="nova-scheduler-scheduler" containerID="cri-o://5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f" gracePeriod=30 Jan 31 09:47:00 crc kubenswrapper[4992]: I0131 09:47:00.501973 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:47:01 crc kubenswrapper[4992]: I0131 09:47:01.194268 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae0611b-2e81-4f74-9acd-b52769aadd42" path="/var/lib/kubelet/pods/5ae0611b-2e81-4f74-9acd-b52769aadd42/volumes" Jan 31 09:47:01 crc kubenswrapper[4992]: I0131 09:47:01.195491 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870499f5-2755-4603-929e-cfb5e79e7dea" path="/var/lib/kubelet/pods/870499f5-2755-4603-929e-cfb5e79e7dea/volumes" Jan 31 09:47:01 crc kubenswrapper[4992]: I0131 09:47:01.384613 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d91af037-5b1b-4543-810e-06667d38a865","Type":"ContainerStarted","Data":"1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55"} Jan 31 
09:47:01 crc kubenswrapper[4992]: I0131 09:47:01.384657 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d91af037-5b1b-4543-810e-06667d38a865","Type":"ContainerStarted","Data":"0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541"} Jan 31 09:47:01 crc kubenswrapper[4992]: I0131 09:47:01.384671 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d91af037-5b1b-4543-810e-06667d38a865","Type":"ContainerStarted","Data":"26e4672c696ec6e1be2d536bf7979cc84cbcab03747671c1da8736f2a6369a27"} Jan 31 09:47:01 crc kubenswrapper[4992]: I0131 09:47:01.404212 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.404192857 podStartE2EDuration="2.404192857s" podCreationTimestamp="2026-01-31 09:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:01.403696133 +0000 UTC m=+1317.375088140" watchObservedRunningTime="2026-01-31 09:47:01.404192857 +0000 UTC m=+1317.375584844" Jan 31 09:47:02 crc kubenswrapper[4992]: E0131 09:47:02.598443 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 31 09:47:02 crc kubenswrapper[4992]: E0131 09:47:02.600240 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 31 09:47:02 crc kubenswrapper[4992]: E0131 
09:47:02.601898 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 31 09:47:02 crc kubenswrapper[4992]: E0131 09:47:02.601978 4992 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9638b204-4079-42e5-9ee7-3cba97d67894" containerName="nova-scheduler-scheduler" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.403657 4992 generic.go:334] "Generic (PLEG): container finished" podID="9638b204-4079-42e5-9ee7-3cba97d67894" containerID="5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f" exitCode=0 Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.403771 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9638b204-4079-42e5-9ee7-3cba97d67894","Type":"ContainerDied","Data":"5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f"} Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.403942 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9638b204-4079-42e5-9ee7-3cba97d67894","Type":"ContainerDied","Data":"e5a817c88af66c2c79a0c6cbd7ebaa62005dbd5d7b1f51e26b4ea160117a6fe1"} Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.403957 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5a817c88af66c2c79a0c6cbd7ebaa62005dbd5d7b1f51e26b4ea160117a6fe1" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.429708 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.512113 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-config-data\") pod \"9638b204-4079-42e5-9ee7-3cba97d67894\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.512241 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-combined-ca-bundle\") pod \"9638b204-4079-42e5-9ee7-3cba97d67894\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.512308 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjg4t\" (UniqueName: \"kubernetes.io/projected/9638b204-4079-42e5-9ee7-3cba97d67894-kube-api-access-qjg4t\") pod \"9638b204-4079-42e5-9ee7-3cba97d67894\" (UID: \"9638b204-4079-42e5-9ee7-3cba97d67894\") " Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.519757 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9638b204-4079-42e5-9ee7-3cba97d67894-kube-api-access-qjg4t" (OuterVolumeSpecName: "kube-api-access-qjg4t") pod "9638b204-4079-42e5-9ee7-3cba97d67894" (UID: "9638b204-4079-42e5-9ee7-3cba97d67894"). InnerVolumeSpecName "kube-api-access-qjg4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.555120 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9638b204-4079-42e5-9ee7-3cba97d67894" (UID: "9638b204-4079-42e5-9ee7-3cba97d67894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.560007 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-config-data" (OuterVolumeSpecName: "config-data") pod "9638b204-4079-42e5-9ee7-3cba97d67894" (UID: "9638b204-4079-42e5-9ee7-3cba97d67894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.614633 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.614675 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9638b204-4079-42e5-9ee7-3cba97d67894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:03 crc kubenswrapper[4992]: I0131 09:47:03.614692 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjg4t\" (UniqueName: \"kubernetes.io/projected/9638b204-4079-42e5-9ee7-3cba97d67894-kube-api-access-qjg4t\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.322829 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.414524 4992 generic.go:334] "Generic (PLEG): container finished" podID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerID="e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258" exitCode=0 Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.414598 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.415127 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.415618 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d22ab2ab-a644-45a9-8f93-4f968450b1d0","Type":"ContainerDied","Data":"e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258"} Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.415658 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d22ab2ab-a644-45a9-8f93-4f968450b1d0","Type":"ContainerDied","Data":"706e12aaf26feb6701868a085254ed166990c1b66b1e0424943255f54ea9cb09"} Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.415681 4992 scope.go:117] "RemoveContainer" containerID="e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.429948 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22ab2ab-a644-45a9-8f93-4f968450b1d0-logs\") pod \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.430027 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjwrx\" (UniqueName: \"kubernetes.io/projected/d22ab2ab-a644-45a9-8f93-4f968450b1d0-kube-api-access-kjwrx\") pod \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.430124 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-combined-ca-bundle\") pod 
\"d22ab2ab-a644-45a9-8f93-4f968450b1d0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.430203 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-config-data\") pod \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\" (UID: \"d22ab2ab-a644-45a9-8f93-4f968450b1d0\") " Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.431287 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d22ab2ab-a644-45a9-8f93-4f968450b1d0-logs" (OuterVolumeSpecName: "logs") pod "d22ab2ab-a644-45a9-8f93-4f968450b1d0" (UID: "d22ab2ab-a644-45a9-8f93-4f968450b1d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.437013 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22ab2ab-a644-45a9-8f93-4f968450b1d0-kube-api-access-kjwrx" (OuterVolumeSpecName: "kube-api-access-kjwrx") pod "d22ab2ab-a644-45a9-8f93-4f968450b1d0" (UID: "d22ab2ab-a644-45a9-8f93-4f968450b1d0"). InnerVolumeSpecName "kube-api-access-kjwrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.439046 4992 scope.go:117] "RemoveContainer" containerID="dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.459147 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.469867 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d22ab2ab-a644-45a9-8f93-4f968450b1d0" (UID: "d22ab2ab-a644-45a9-8f93-4f968450b1d0"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.473389 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-config-data" (OuterVolumeSpecName: "config-data") pod "d22ab2ab-a644-45a9-8f93-4f968450b1d0" (UID: "d22ab2ab-a644-45a9-8f93-4f968450b1d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.480498 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.498471 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: E0131 09:47:04.498964 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-api" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.498980 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-api" Jan 31 09:47:04 crc kubenswrapper[4992]: E0131 09:47:04.499007 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-log" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.499014 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-log" Jan 31 09:47:04 crc kubenswrapper[4992]: E0131 09:47:04.499039 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9638b204-4079-42e5-9ee7-3cba97d67894" containerName="nova-scheduler-scheduler" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.499048 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9638b204-4079-42e5-9ee7-3cba97d67894" 
containerName="nova-scheduler-scheduler" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.499245 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-log" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.499268 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9638b204-4079-42e5-9ee7-3cba97d67894" containerName="nova-scheduler-scheduler" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.499291 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" containerName="nova-api-api" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.500043 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.501527 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.532701 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-config-data\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.532770 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzgk\" (UniqueName: \"kubernetes.io/projected/d9a803da-9736-4555-8851-5a87c3421592-kube-api-access-bzzgk\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.532813 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.533299 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.533311 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d22ab2ab-a644-45a9-8f93-4f968450b1d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.533319 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d22ab2ab-a644-45a9-8f93-4f968450b1d0-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.533328 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjwrx\" (UniqueName: \"kubernetes.io/projected/d22ab2ab-a644-45a9-8f93-4f968450b1d0-kube-api-access-kjwrx\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.534461 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.551094 4992 scope.go:117] "RemoveContainer" containerID="e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258" Jan 31 09:47:04 crc kubenswrapper[4992]: E0131 09:47:04.552941 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258\": container with ID starting with e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258 not found: ID does not exist" 
containerID="e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.552982 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258"} err="failed to get container status \"e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258\": rpc error: code = NotFound desc = could not find container \"e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258\": container with ID starting with e1ee4492e78ea8910058b5fa2cc30174873f24d044f5739f87296318881df258 not found: ID does not exist" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.553008 4992 scope.go:117] "RemoveContainer" containerID="dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92" Jan 31 09:47:04 crc kubenswrapper[4992]: E0131 09:47:04.553374 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92\": container with ID starting with dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92 not found: ID does not exist" containerID="dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.553459 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92"} err="failed to get container status \"dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92\": rpc error: code = NotFound desc = could not find container \"dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92\": container with ID starting with dc83dfe0fdc0b0cfcd84d6f9ac1a6f275645ceffe5e5993c49a115019dc30b92 not found: ID does not exist" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.634525 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-config-data\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.634610 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzgk\" (UniqueName: \"kubernetes.io/projected/d9a803da-9736-4555-8851-5a87c3421592-kube-api-access-bzzgk\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.634639 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.638633 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.639054 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-config-data\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.649307 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzgk\" (UniqueName: 
\"kubernetes.io/projected/d9a803da-9736-4555-8851-5a87c3421592-kube-api-access-bzzgk\") pod \"nova-scheduler-0\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") " pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.755806 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.773750 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.787477 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.789064 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.792503 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.796783 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.838600 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-config-data\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.838967 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.839001 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-g74k9\" (UniqueName: \"kubernetes.io/projected/d00a5534-142d-4b51-baf4-3c85d846f803-kube-api-access-g74k9\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.839039 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5534-142d-4b51-baf4-3c85d846f803-logs\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.857244 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.940799 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.940870 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74k9\" (UniqueName: \"kubernetes.io/projected/d00a5534-142d-4b51-baf4-3c85d846f803-kube-api-access-g74k9\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.940927 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5534-142d-4b51-baf4-3c85d846f803-logs\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.941001 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-config-data\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.942358 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5534-142d-4b51-baf4-3c85d846f803-logs\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.946329 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.947140 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-config-data\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:04 crc kubenswrapper[4992]: I0131 09:47:04.959919 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74k9\" (UniqueName: \"kubernetes.io/projected/d00a5534-142d-4b51-baf4-3c85d846f803-kube-api-access-g74k9\") pod \"nova-api-0\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " pod="openstack/nova-api-0" Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.033206 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.033270 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.108097 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.209616 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9638b204-4079-42e5-9ee7-3cba97d67894" path="/var/lib/kubelet/pods/9638b204-4079-42e5-9ee7-3cba97d67894/volumes" Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.210544 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22ab2ab-a644-45a9-8f93-4f968450b1d0" path="/var/lib/kubelet/pods/d22ab2ab-a644-45a9-8f93-4f968450b1d0/volumes" Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.267085 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:47:05 crc kubenswrapper[4992]: W0131 09:47:05.271613 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9a803da_9736_4555_8851_5a87c3421592.slice/crio-877dea6aa3bcc5c0f8ce961c862f25a7658066cc1dd987631d403f2c2d9ebfc1 WatchSource:0}: Error finding container 877dea6aa3bcc5c0f8ce961c862f25a7658066cc1dd987631d403f2c2d9ebfc1: Status 404 returned error can't find the container with id 877dea6aa3bcc5c0f8ce961c862f25a7658066cc1dd987631d403f2c2d9ebfc1 Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.424603 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a803da-9736-4555-8851-5a87c3421592","Type":"ContainerStarted","Data":"877dea6aa3bcc5c0f8ce961c862f25a7658066cc1dd987631d403f2c2d9ebfc1"} Jan 31 09:47:05 crc kubenswrapper[4992]: I0131 09:47:05.545973 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:05 crc kubenswrapper[4992]: W0131 09:47:05.547043 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd00a5534_142d_4b51_baf4_3c85d846f803.slice/crio-57f4fadb4ece04f01331745ac14616812fa4983ac20fb313119e33149dcd4057 WatchSource:0}: Error finding container 57f4fadb4ece04f01331745ac14616812fa4983ac20fb313119e33149dcd4057: Status 404 returned error can't find the container with id 57f4fadb4ece04f01331745ac14616812fa4983ac20fb313119e33149dcd4057 Jan 31 09:47:06 crc kubenswrapper[4992]: I0131 09:47:06.452457 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d00a5534-142d-4b51-baf4-3c85d846f803","Type":"ContainerStarted","Data":"5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030"} Jan 31 09:47:06 crc kubenswrapper[4992]: I0131 09:47:06.452805 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d00a5534-142d-4b51-baf4-3c85d846f803","Type":"ContainerStarted","Data":"ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b"} Jan 31 09:47:06 crc kubenswrapper[4992]: I0131 09:47:06.452821 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d00a5534-142d-4b51-baf4-3c85d846f803","Type":"ContainerStarted","Data":"57f4fadb4ece04f01331745ac14616812fa4983ac20fb313119e33149dcd4057"} Jan 31 09:47:06 crc kubenswrapper[4992]: I0131 09:47:06.454709 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a803da-9736-4555-8851-5a87c3421592","Type":"ContainerStarted","Data":"5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5"} Jan 31 09:47:06 crc kubenswrapper[4992]: I0131 09:47:06.507785 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.507763254 podStartE2EDuration="2.507763254s" podCreationTimestamp="2026-01-31 09:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 09:47:06.504664717 +0000 UTC m=+1322.476056724" watchObservedRunningTime="2026-01-31 09:47:06.507763254 +0000 UTC m=+1322.479155241" Jan 31 09:47:06 crc kubenswrapper[4992]: I0131 09:47:06.510993 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5109828849999998 podStartE2EDuration="2.510982885s" podCreationTimestamp="2026-01-31 09:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:06.48490975 +0000 UTC m=+1322.456301747" watchObservedRunningTime="2026-01-31 09:47:06.510982885 +0000 UTC m=+1322.482374872" Jan 31 09:47:08 crc kubenswrapper[4992]: I0131 09:47:08.742678 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 31 09:47:09 crc kubenswrapper[4992]: I0131 09:47:09.857486 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 31 09:47:10 crc kubenswrapper[4992]: I0131 09:47:10.033876 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:47:10 crc kubenswrapper[4992]: I0131 09:47:10.033951 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:47:11 crc kubenswrapper[4992]: I0131 09:47:11.046738 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:47:11 crc kubenswrapper[4992]: I0131 09:47:11.046776 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d91af037-5b1b-4543-810e-06667d38a865" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:47:14 crc kubenswrapper[4992]: I0131 09:47:14.858515 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 31 09:47:14 crc kubenswrapper[4992]: I0131 09:47:14.896906 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 31 09:47:15 crc kubenswrapper[4992]: I0131 09:47:15.109787 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:47:15 crc kubenswrapper[4992]: I0131 09:47:15.109876 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:47:15 crc kubenswrapper[4992]: I0131 09:47:15.574974 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 31 09:47:16 crc kubenswrapper[4992]: I0131 09:47:16.192588 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:47:16 crc kubenswrapper[4992]: I0131 09:47:16.192594 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:47:20 crc kubenswrapper[4992]: I0131 09:47:20.040217 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 31 09:47:20 crc kubenswrapper[4992]: I0131 09:47:20.055620 4992 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 31 09:47:20 crc kubenswrapper[4992]: I0131 09:47:20.056724 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 31 09:47:20 crc kubenswrapper[4992]: I0131 09:47:20.595921 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.607354 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.608206 4992 generic.go:334] "Generic (PLEG): container finished" podID="215f5e28-cf6d-4221-ba1d-25d894544fcd" containerID="09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053" exitCode=137 Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.608286 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215f5e28-cf6d-4221-ba1d-25d894544fcd","Type":"ContainerDied","Data":"09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053"} Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.608327 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"215f5e28-cf6d-4221-ba1d-25d894544fcd","Type":"ContainerDied","Data":"fec23e07e4ab9042c92fdac36824ab94383b1ac7cd7e8abae60e9ccc636f60df"} Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.608348 4992 scope.go:117] "RemoveContainer" containerID="09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.639308 4992 scope.go:117] "RemoveContainer" containerID="09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053" Jan 31 09:47:22 crc kubenswrapper[4992]: E0131 09:47:22.639770 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053\": container with ID starting with 09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053 not found: ID does not exist" containerID="09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.639800 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053"} err="failed to get container status \"09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053\": rpc error: code = NotFound desc = could not find container \"09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053\": container with ID starting with 09c9850174a459563d3a4ab9bdf25a941e0ac7dbb69c9b81d08f5dc666be6053 not found: ID does not exist" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.685468 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29p9c\" (UniqueName: \"kubernetes.io/projected/215f5e28-cf6d-4221-ba1d-25d894544fcd-kube-api-access-29p9c\") pod \"215f5e28-cf6d-4221-ba1d-25d894544fcd\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.685975 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-config-data\") pod \"215f5e28-cf6d-4221-ba1d-25d894544fcd\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.686072 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-combined-ca-bundle\") pod \"215f5e28-cf6d-4221-ba1d-25d894544fcd\" (UID: \"215f5e28-cf6d-4221-ba1d-25d894544fcd\") " Jan 31 09:47:22 crc 
kubenswrapper[4992]: I0131 09:47:22.695595 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215f5e28-cf6d-4221-ba1d-25d894544fcd-kube-api-access-29p9c" (OuterVolumeSpecName: "kube-api-access-29p9c") pod "215f5e28-cf6d-4221-ba1d-25d894544fcd" (UID: "215f5e28-cf6d-4221-ba1d-25d894544fcd"). InnerVolumeSpecName "kube-api-access-29p9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.713908 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-config-data" (OuterVolumeSpecName: "config-data") pod "215f5e28-cf6d-4221-ba1d-25d894544fcd" (UID: "215f5e28-cf6d-4221-ba1d-25d894544fcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.715476 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "215f5e28-cf6d-4221-ba1d-25d894544fcd" (UID: "215f5e28-cf6d-4221-ba1d-25d894544fcd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.787997 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.788214 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f5e28-cf6d-4221-ba1d-25d894544fcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:22 crc kubenswrapper[4992]: I0131 09:47:22.788285 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29p9c\" (UniqueName: \"kubernetes.io/projected/215f5e28-cf6d-4221-ba1d-25d894544fcd-kube-api-access-29p9c\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.616848 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.642984 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.663394 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.671305 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:47:23 crc kubenswrapper[4992]: E0131 09:47:23.671848 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215f5e28-cf6d-4221-ba1d-25d894544fcd" containerName="nova-cell1-novncproxy-novncproxy" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.671870 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="215f5e28-cf6d-4221-ba1d-25d894544fcd" containerName="nova-cell1-novncproxy-novncproxy" Jan 31 09:47:23 crc kubenswrapper[4992]: 
I0131 09:47:23.672101 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="215f5e28-cf6d-4221-ba1d-25d894544fcd" containerName="nova-cell1-novncproxy-novncproxy" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.672862 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.675050 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.675601 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.676016 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.687391 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.805629 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.805684 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.805724 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.805898 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.805959 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2s87\" (UniqueName: \"kubernetes.io/projected/bcc32173-2a8d-436e-84b1-bc687b7d8e23-kube-api-access-r2s87\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.907711 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.908065 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.908219 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.908332 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.908468 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2s87\" (UniqueName: \"kubernetes.io/projected/bcc32173-2a8d-436e-84b1-bc687b7d8e23-kube-api-access-r2s87\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.914048 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.914658 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.914680 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.915873 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc32173-2a8d-436e-84b1-bc687b7d8e23-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.927998 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2s87\" (UniqueName: \"kubernetes.io/projected/bcc32173-2a8d-436e-84b1-bc687b7d8e23-kube-api-access-r2s87\") pod \"nova-cell1-novncproxy-0\" (UID: \"bcc32173-2a8d-436e-84b1-bc687b7d8e23\") " pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:23 crc kubenswrapper[4992]: I0131 09:47:23.989150 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:24 crc kubenswrapper[4992]: I0131 09:47:24.426328 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 31 09:47:24 crc kubenswrapper[4992]: W0131 09:47:24.426392 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcc32173_2a8d_436e_84b1_bc687b7d8e23.slice/crio-3b471b6f5f17ce37fbcd79c72a095217917f85037c6e1c0163a347c7945afdce WatchSource:0}: Error finding container 3b471b6f5f17ce37fbcd79c72a095217917f85037c6e1c0163a347c7945afdce: Status 404 returned error can't find the container with id 3b471b6f5f17ce37fbcd79c72a095217917f85037c6e1c0163a347c7945afdce Jan 31 09:47:24 crc kubenswrapper[4992]: I0131 09:47:24.625886 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bcc32173-2a8d-436e-84b1-bc687b7d8e23","Type":"ContainerStarted","Data":"e66674c9f629c475feacbe6c09eef0f1e25f36876545a18871a571aae9b29509"} Jan 31 09:47:24 crc kubenswrapper[4992]: I0131 09:47:24.626173 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bcc32173-2a8d-436e-84b1-bc687b7d8e23","Type":"ContainerStarted","Data":"3b471b6f5f17ce37fbcd79c72a095217917f85037c6e1c0163a347c7945afdce"} Jan 31 09:47:24 crc kubenswrapper[4992]: I0131 09:47:24.651277 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.6512505709999998 podStartE2EDuration="1.651250571s" podCreationTimestamp="2026-01-31 09:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:24.650285404 +0000 UTC m=+1340.621677431" watchObservedRunningTime="2026-01-31 09:47:24.651250571 +0000 UTC m=+1340.622642598" Jan 31 09:47:25 crc kubenswrapper[4992]: 
I0131 09:47:25.113794 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.114242 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.114488 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.119682 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.203781 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215f5e28-cf6d-4221-ba1d-25d894544fcd" path="/var/lib/kubelet/pods/215f5e28-cf6d-4221-ba1d-25d894544fcd/volumes" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.634868 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.638091 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.818792 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-lnws2"] Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.820171 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.851289 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-lnws2"] Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.948203 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.948278 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.948441 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6mx7\" (UniqueName: \"kubernetes.io/projected/5e4678f5-97cd-4850-92f1-486ae4ddafda-kube-api-access-v6mx7\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.948520 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-config\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:25 crc kubenswrapper[4992]: I0131 09:47:25.948546 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.050847 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.050902 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.050963 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6mx7\" (UniqueName: \"kubernetes.io/projected/5e4678f5-97cd-4850-92f1-486ae4ddafda-kube-api-access-v6mx7\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.051012 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-config\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.051038 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.051952 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.051963 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.052075 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.052675 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-config\") pod \"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.073110 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6mx7\" (UniqueName: \"kubernetes.io/projected/5e4678f5-97cd-4850-92f1-486ae4ddafda-kube-api-access-v6mx7\") pod 
\"dnsmasq-dns-68d4b6d797-lnws2\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.156172 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:26 crc kubenswrapper[4992]: I0131 09:47:26.639652 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-lnws2"] Jan 31 09:47:27 crc kubenswrapper[4992]: I0131 09:47:27.671502 4992 generic.go:334] "Generic (PLEG): container finished" podID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerID="bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347" exitCode=0 Jan 31 09:47:27 crc kubenswrapper[4992]: I0131 09:47:27.671556 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" event={"ID":"5e4678f5-97cd-4850-92f1-486ae4ddafda","Type":"ContainerDied","Data":"bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347"} Jan 31 09:47:27 crc kubenswrapper[4992]: I0131 09:47:27.672025 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" event={"ID":"5e4678f5-97cd-4850-92f1-486ae4ddafda","Type":"ContainerStarted","Data":"68eba168a4a0ea59a3436b4d95a2884e8520680bf22e399f3db163833aa0859d"} Jan 31 09:47:27 crc kubenswrapper[4992]: I0131 09:47:27.971582 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.126788 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.127111 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-central-agent" containerID="cri-o://727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b" 
gracePeriod=30 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.127190 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="proxy-httpd" containerID="cri-o://f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4" gracePeriod=30 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.127228 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-notification-agent" containerID="cri-o://d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e" gracePeriod=30 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.127230 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="sg-core" containerID="cri-o://431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd" gracePeriod=30 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.680754 4992 generic.go:334] "Generic (PLEG): container finished" podID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerID="f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4" exitCode=0 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.680784 4992 generic.go:334] "Generic (PLEG): container finished" podID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerID="431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd" exitCode=2 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.680791 4992 generic.go:334] "Generic (PLEG): container finished" podID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerID="727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b" exitCode=0 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.680825 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerDied","Data":"f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4"} Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.680848 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerDied","Data":"431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd"} Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.680856 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerDied","Data":"727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b"} Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.682939 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-log" containerID="cri-o://ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b" gracePeriod=30 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.683623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" event={"ID":"5e4678f5-97cd-4850-92f1-486ae4ddafda","Type":"ContainerStarted","Data":"a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495"} Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.683630 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-api" containerID="cri-o://5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030" gracePeriod=30 Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.683712 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.715466 4992 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" podStartSLOduration=3.7154472910000003 podStartE2EDuration="3.715447291s" podCreationTimestamp="2026-01-31 09:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:28.706756526 +0000 UTC m=+1344.678148533" watchObservedRunningTime="2026-01-31 09:47:28.715447291 +0000 UTC m=+1344.686839288" Jan 31 09:47:28 crc kubenswrapper[4992]: I0131 09:47:28.990219 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.497495 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.609274 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-scripts\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.609375 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-ceilometer-tls-certs\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.609409 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-sg-core-conf-yaml\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.609494 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-combined-ca-bundle\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.609540 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-log-httpd\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610105 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610148 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-run-httpd\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610205 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-config-data\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610242 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdlk4\" (UniqueName: \"kubernetes.io/projected/76229fae-f3ae-4090-89a2-43780cf2f2ba-kube-api-access-jdlk4\") pod \"76229fae-f3ae-4090-89a2-43780cf2f2ba\" (UID: \"76229fae-f3ae-4090-89a2-43780cf2f2ba\") " Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610270 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610598 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.610612 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76229fae-f3ae-4090-89a2-43780cf2f2ba-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.614694 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-scripts" (OuterVolumeSpecName: "scripts") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.616174 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76229fae-f3ae-4090-89a2-43780cf2f2ba-kube-api-access-jdlk4" (OuterVolumeSpecName: "kube-api-access-jdlk4") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "kube-api-access-jdlk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.641668 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.668862 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.700961 4992 generic.go:334] "Generic (PLEG): container finished" podID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerID="d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e" exitCode=0 Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.701033 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.701024 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerDied","Data":"d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e"} Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.701184 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76229fae-f3ae-4090-89a2-43780cf2f2ba","Type":"ContainerDied","Data":"6a3ac477702cee8d8d1aee11ed0559e13ad36400803d6991787d80b736cc1099"} Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.701214 4992 scope.go:117] "RemoveContainer" containerID="f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.703927 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.704962 4992 generic.go:334] "Generic (PLEG): container finished" podID="d00a5534-142d-4b51-baf4-3c85d846f803" containerID="ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b" exitCode=143 Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.705041 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d00a5534-142d-4b51-baf4-3c85d846f803","Type":"ContainerDied","Data":"ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b"} Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.707825 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-config-data" (OuterVolumeSpecName: "config-data") pod "76229fae-f3ae-4090-89a2-43780cf2f2ba" (UID: "76229fae-f3ae-4090-89a2-43780cf2f2ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.712125 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.712160 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdlk4\" (UniqueName: \"kubernetes.io/projected/76229fae-f3ae-4090-89a2-43780cf2f2ba-kube-api-access-jdlk4\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.712173 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.712185 4992 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.712199 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.712209 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76229fae-f3ae-4090-89a2-43780cf2f2ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.760315 4992 scope.go:117] "RemoveContainer" containerID="431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.776775 4992 scope.go:117] "RemoveContainer" 
containerID="d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.798279 4992 scope.go:117] "RemoveContainer" containerID="727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.816769 4992 scope.go:117] "RemoveContainer" containerID="f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4" Jan 31 09:47:29 crc kubenswrapper[4992]: E0131 09:47:29.817256 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4\": container with ID starting with f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4 not found: ID does not exist" containerID="f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.817304 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4"} err="failed to get container status \"f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4\": rpc error: code = NotFound desc = could not find container \"f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4\": container with ID starting with f6e5016f2f0a2c211edbbc9e7e22d9a0d0486b229b93a9f2596adfb4f0a46ce4 not found: ID does not exist" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.817338 4992 scope.go:117] "RemoveContainer" containerID="431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd" Jan 31 09:47:29 crc kubenswrapper[4992]: E0131 09:47:29.817900 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd\": container with ID starting with 
431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd not found: ID does not exist" containerID="431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.817936 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd"} err="failed to get container status \"431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd\": rpc error: code = NotFound desc = could not find container \"431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd\": container with ID starting with 431505c3f1b6b076814eca1233194bd54942880895afb3ffad41d351626ab0dd not found: ID does not exist" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.817955 4992 scope.go:117] "RemoveContainer" containerID="d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e" Jan 31 09:47:29 crc kubenswrapper[4992]: E0131 09:47:29.818306 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e\": container with ID starting with d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e not found: ID does not exist" containerID="d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.818335 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e"} err="failed to get container status \"d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e\": rpc error: code = NotFound desc = could not find container \"d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e\": container with ID starting with d07f4fd023a0f591608a8f30a49ef56a8de0c5a47a3c2d22e0bda8d02168d22e not found: ID does not 
exist" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.818356 4992 scope.go:117] "RemoveContainer" containerID="727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b" Jan 31 09:47:29 crc kubenswrapper[4992]: E0131 09:47:29.818710 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b\": container with ID starting with 727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b not found: ID does not exist" containerID="727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b" Jan 31 09:47:29 crc kubenswrapper[4992]: I0131 09:47:29.818733 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b"} err="failed to get container status \"727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b\": rpc error: code = NotFound desc = could not find container \"727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b\": container with ID starting with 727919957ab28db2bf007f072728a920d1dcfe88af1797168fce28ae654ff27b not found: ID does not exist" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.046200 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.060401 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068177 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:47:30 crc kubenswrapper[4992]: E0131 09:47:30.068563 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-notification-agent" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068582 4992 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-notification-agent" Jan 31 09:47:30 crc kubenswrapper[4992]: E0131 09:47:30.068598 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="proxy-httpd" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068604 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="proxy-httpd" Jan 31 09:47:30 crc kubenswrapper[4992]: E0131 09:47:30.068616 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-central-agent" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068623 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-central-agent" Jan 31 09:47:30 crc kubenswrapper[4992]: E0131 09:47:30.068643 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="sg-core" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068648 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="sg-core" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068810 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-central-agent" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068825 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="sg-core" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068834 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="ceilometer-notification-agent" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.068843 4992 
memory_manager.go:354] "RemoveStaleState removing state" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" containerName="proxy-httpd" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.070313 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.074040 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.074085 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.075351 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.079339 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220031 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-log-httpd\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220128 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220212 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-scripts\") pod \"ceilometer-0\" (UID: 
\"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220276 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220299 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-run-httpd\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220322 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72vx\" (UniqueName: \"kubernetes.io/projected/8eca84ee-5e16-4802-b7d5-88df97d8787b-kube-api-access-t72vx\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220684 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.220845 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-config-data\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc 
kubenswrapper[4992]: I0131 09:47:30.322609 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-config-data\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322722 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-log-httpd\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322798 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322823 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-scripts\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322846 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322864 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-run-httpd\") pod 
\"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322879 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t72vx\" (UniqueName: \"kubernetes.io/projected/8eca84ee-5e16-4802-b7d5-88df97d8787b-kube-api-access-t72vx\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.322986 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.323296 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-log-httpd\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.323448 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-run-httpd\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.327764 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.328615 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-scripts\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.330236 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-config-data\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.332029 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.332169 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.346147 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72vx\" (UniqueName: \"kubernetes.io/projected/8eca84ee-5e16-4802-b7d5-88df97d8787b-kube-api-access-t72vx\") pod \"ceilometer-0\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.400739 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 09:47:30 crc kubenswrapper[4992]: I0131 09:47:30.889715 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 09:47:31 crc kubenswrapper[4992]: I0131 09:47:31.195260 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76229fae-f3ae-4090-89a2-43780cf2f2ba" path="/var/lib/kubelet/pods/76229fae-f3ae-4090-89a2-43780cf2f2ba/volumes" Jan 31 09:47:31 crc kubenswrapper[4992]: I0131 09:47:31.725347 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerStarted","Data":"25f461553e711b8fd0eea5fd5f8dfb1af4c0f6d2cd7def1af3d1291e940fcfdd"} Jan 31 09:47:31 crc kubenswrapper[4992]: I0131 09:47:31.725747 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerStarted","Data":"f592a23950e0a35401e2ad2469d91f5ffa927e5897470ec6221c624f1b70cbf2"} Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.216844 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.369147 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g74k9\" (UniqueName: \"kubernetes.io/projected/d00a5534-142d-4b51-baf4-3c85d846f803-kube-api-access-g74k9\") pod \"d00a5534-142d-4b51-baf4-3c85d846f803\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.369526 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5534-142d-4b51-baf4-3c85d846f803-logs\") pod \"d00a5534-142d-4b51-baf4-3c85d846f803\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.369569 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-combined-ca-bundle\") pod \"d00a5534-142d-4b51-baf4-3c85d846f803\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.369644 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-config-data\") pod \"d00a5534-142d-4b51-baf4-3c85d846f803\" (UID: \"d00a5534-142d-4b51-baf4-3c85d846f803\") " Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.371862 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00a5534-142d-4b51-baf4-3c85d846f803-logs" (OuterVolumeSpecName: "logs") pod "d00a5534-142d-4b51-baf4-3c85d846f803" (UID: "d00a5534-142d-4b51-baf4-3c85d846f803"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.386749 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00a5534-142d-4b51-baf4-3c85d846f803-kube-api-access-g74k9" (OuterVolumeSpecName: "kube-api-access-g74k9") pod "d00a5534-142d-4b51-baf4-3c85d846f803" (UID: "d00a5534-142d-4b51-baf4-3c85d846f803"). InnerVolumeSpecName "kube-api-access-g74k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.422581 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d00a5534-142d-4b51-baf4-3c85d846f803" (UID: "d00a5534-142d-4b51-baf4-3c85d846f803"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.440594 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-config-data" (OuterVolumeSpecName: "config-data") pod "d00a5534-142d-4b51-baf4-3c85d846f803" (UID: "d00a5534-142d-4b51-baf4-3c85d846f803"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.473958 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g74k9\" (UniqueName: \"kubernetes.io/projected/d00a5534-142d-4b51-baf4-3c85d846f803-kube-api-access-g74k9\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.474004 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d00a5534-142d-4b51-baf4-3c85d846f803-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.474015 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.474022 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d00a5534-142d-4b51-baf4-3c85d846f803-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.738197 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerStarted","Data":"5914eee67ce969f2ce76a0d54e5f962a003738b2e8d0d1b9a580fe74caa5ac52"} Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.740371 4992 generic.go:334] "Generic (PLEG): container finished" podID="d00a5534-142d-4b51-baf4-3c85d846f803" containerID="5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030" exitCode=0 Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.740398 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d00a5534-142d-4b51-baf4-3c85d846f803","Type":"ContainerDied","Data":"5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030"} Jan 31 09:47:32 crc 
kubenswrapper[4992]: I0131 09:47:32.740430 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d00a5534-142d-4b51-baf4-3c85d846f803","Type":"ContainerDied","Data":"57f4fadb4ece04f01331745ac14616812fa4983ac20fb313119e33149dcd4057"} Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.740447 4992 scope.go:117] "RemoveContainer" containerID="5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.740642 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.871622 4992 scope.go:117] "RemoveContainer" containerID="ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.882600 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.898914 4992 scope.go:117] "RemoveContainer" containerID="5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030" Jan 31 09:47:32 crc kubenswrapper[4992]: E0131 09:47:32.899679 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030\": container with ID starting with 5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030 not found: ID does not exist" containerID="5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.899740 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030"} err="failed to get container status \"5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030\": rpc error: code = NotFound desc = could not find container 
\"5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030\": container with ID starting with 5cac9d95bc82e2d51776d256dac064169102eaaca9a3ae5ace008fb525625030 not found: ID does not exist" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.899767 4992 scope.go:117] "RemoveContainer" containerID="ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b" Jan 31 09:47:32 crc kubenswrapper[4992]: E0131 09:47:32.901862 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b\": container with ID starting with ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b not found: ID does not exist" containerID="ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.901904 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b"} err="failed to get container status \"ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b\": rpc error: code = NotFound desc = could not find container \"ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b\": container with ID starting with ca5f422ac76dd2e5013898dbb5db9291a9b5967dbaa43da4931c413dc992594b not found: ID does not exist" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.905280 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.922690 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:32 crc kubenswrapper[4992]: E0131 09:47:32.923227 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-api" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.923243 4992 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-api" Jan 31 09:47:32 crc kubenswrapper[4992]: E0131 09:47:32.923292 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-log" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.923300 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-log" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.923495 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-api" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.923531 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" containerName="nova-api-log" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.924858 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.927123 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.927282 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.927485 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 31 09:47:32 crc kubenswrapper[4992]: I0131 09:47:32.933001 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.085617 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8g49\" (UniqueName: \"kubernetes.io/projected/d7561038-f719-41eb-96ae-59d27985c8eb-kube-api-access-j8g49\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.085692 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.085814 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.086053 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-config-data\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.086115 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.086204 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7561038-f719-41eb-96ae-59d27985c8eb-logs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.186982 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7561038-f719-41eb-96ae-59d27985c8eb-logs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.187054 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8g49\" (UniqueName: \"kubernetes.io/projected/d7561038-f719-41eb-96ae-59d27985c8eb-kube-api-access-j8g49\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.187080 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 
crc kubenswrapper[4992]: I0131 09:47:33.187114 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.187809 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7561038-f719-41eb-96ae-59d27985c8eb-logs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.188025 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-config-data\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.188144 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.192217 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-public-tls-certs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.192458 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-config-data\") pod \"nova-api-0\" (UID: 
\"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.195697 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00a5534-142d-4b51-baf4-3c85d846f803" path="/var/lib/kubelet/pods/d00a5534-142d-4b51-baf4-3c85d846f803/volumes" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.198974 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.204672 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.215477 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8g49\" (UniqueName: \"kubernetes.io/projected/d7561038-f719-41eb-96ae-59d27985c8eb-kube-api-access-j8g49\") pod \"nova-api-0\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") " pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.244334 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.700380 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.757397 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerStarted","Data":"b3ce0be0f6382442cb59b5c72fb4344a0e99b50335ff474fed060208118d214b"} Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.759358 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7561038-f719-41eb-96ae-59d27985c8eb","Type":"ContainerStarted","Data":"ee3ca92e7ef4699a0ef0d8e8921c1342185e210815dc1623f4a295db8c573746"} Jan 31 09:47:33 crc kubenswrapper[4992]: I0131 09:47:33.989399 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:34 crc kubenswrapper[4992]: I0131 09:47:34.009366 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:34 crc kubenswrapper[4992]: I0131 09:47:34.769651 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7561038-f719-41eb-96ae-59d27985c8eb","Type":"ContainerStarted","Data":"2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349"} Jan 31 09:47:34 crc kubenswrapper[4992]: I0131 09:47:34.769978 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7561038-f719-41eb-96ae-59d27985c8eb","Type":"ContainerStarted","Data":"22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516"} Jan 31 09:47:34 crc kubenswrapper[4992]: I0131 09:47:34.784406 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 31 09:47:34 crc kubenswrapper[4992]: I0131 09:47:34.792037 4992 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.792004975 podStartE2EDuration="2.792004975s" podCreationTimestamp="2026-01-31 09:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:34.787728445 +0000 UTC m=+1350.759120442" watchObservedRunningTime="2026-01-31 09:47:34.792004975 +0000 UTC m=+1350.763396962" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.043024 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zffb4"] Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.044387 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.047670 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.047922 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.053136 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zffb4"] Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.124633 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-scripts\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.124694 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnzks\" (UniqueName: 
\"kubernetes.io/projected/2f340cf4-078d-4c29-819e-0e29fc2ff63b-kube-api-access-dnzks\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.124849 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.124906 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-config-data\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.226484 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-config-data\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.226795 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-scripts\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.226966 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnzks\" (UniqueName: 
\"kubernetes.io/projected/2f340cf4-078d-4c29-819e-0e29fc2ff63b-kube-api-access-dnzks\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.227213 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.232020 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.232096 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-scripts\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.232464 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-config-data\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.249087 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnzks\" (UniqueName: 
\"kubernetes.io/projected/2f340cf4-078d-4c29-819e-0e29fc2ff63b-kube-api-access-dnzks\") pod \"nova-cell1-cell-mapping-zffb4\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.480228 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.782799 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerStarted","Data":"7e8aa85ebc8c634b4478299a2b4601bb2481a63865048dc5b4a72a7cdb93ceab"} Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.920749 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8273758089999999 podStartE2EDuration="5.92073264s" podCreationTimestamp="2026-01-31 09:47:30 +0000 UTC" firstStartedPulling="2026-01-31 09:47:30.899985327 +0000 UTC m=+1346.871377304" lastFinishedPulling="2026-01-31 09:47:34.993342148 +0000 UTC m=+1350.964734135" observedRunningTime="2026-01-31 09:47:35.815084963 +0000 UTC m=+1351.786476970" watchObservedRunningTime="2026-01-31 09:47:35.92073264 +0000 UTC m=+1351.892124627" Jan 31 09:47:35 crc kubenswrapper[4992]: I0131 09:47:35.931006 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zffb4"] Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.158477 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.294388 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-td2ws"] Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.294688 4992 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" containerName="dnsmasq-dns" containerID="cri-o://d8073dcb0ad1a519a5f0fcc8b3b2d7af060581856e49c38ea24df8525e40ad5e" gracePeriod=10 Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.792918 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zffb4" event={"ID":"2f340cf4-078d-4c29-819e-0e29fc2ff63b","Type":"ContainerStarted","Data":"d179f0c605b0e131a357af624807c1d9dd307cc6a6a12afe3fcb80c4d1e2deec"} Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.793045 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zffb4" event={"ID":"2f340cf4-078d-4c29-819e-0e29fc2ff63b","Type":"ContainerStarted","Data":"1db2de40c2e7341cd5a60b3c882e9120b79fab243da6371f89f144e4ad7c6f36"} Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.795718 4992 generic.go:334] "Generic (PLEG): container finished" podID="ad622716-6050-414e-b1c3-5a84c0881d16" containerID="d8073dcb0ad1a519a5f0fcc8b3b2d7af060581856e49c38ea24df8525e40ad5e" exitCode=0 Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.796574 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" event={"ID":"ad622716-6050-414e-b1c3-5a84c0881d16","Type":"ContainerDied","Data":"d8073dcb0ad1a519a5f0fcc8b3b2d7af060581856e49c38ea24df8525e40ad5e"} Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.796667 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 09:47:36 crc kubenswrapper[4992]: I0131 09:47:36.816152 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zffb4" podStartSLOduration=1.816125271 podStartE2EDuration="1.816125271s" podCreationTimestamp="2026-01-31 09:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 09:47:36.811813169 +0000 UTC m=+1352.783205176" watchObservedRunningTime="2026-01-31 09:47:36.816125271 +0000 UTC m=+1352.787517268" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.339267 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.473109 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89xb2\" (UniqueName: \"kubernetes.io/projected/ad622716-6050-414e-b1c3-5a84c0881d16-kube-api-access-89xb2\") pod \"ad622716-6050-414e-b1c3-5a84c0881d16\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.473154 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-config\") pod \"ad622716-6050-414e-b1c3-5a84c0881d16\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.473243 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-dns-svc\") pod \"ad622716-6050-414e-b1c3-5a84c0881d16\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.473291 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-nb\") pod \"ad622716-6050-414e-b1c3-5a84c0881d16\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.473354 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-sb\") pod \"ad622716-6050-414e-b1c3-5a84c0881d16\" (UID: \"ad622716-6050-414e-b1c3-5a84c0881d16\") " Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.488573 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad622716-6050-414e-b1c3-5a84c0881d16-kube-api-access-89xb2" (OuterVolumeSpecName: "kube-api-access-89xb2") pod "ad622716-6050-414e-b1c3-5a84c0881d16" (UID: "ad622716-6050-414e-b1c3-5a84c0881d16"). InnerVolumeSpecName "kube-api-access-89xb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.519979 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad622716-6050-414e-b1c3-5a84c0881d16" (UID: "ad622716-6050-414e-b1c3-5a84c0881d16"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.523819 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad622716-6050-414e-b1c3-5a84c0881d16" (UID: "ad622716-6050-414e-b1c3-5a84c0881d16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.533353 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad622716-6050-414e-b1c3-5a84c0881d16" (UID: "ad622716-6050-414e-b1c3-5a84c0881d16"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.538459 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-config" (OuterVolumeSpecName: "config") pod "ad622716-6050-414e-b1c3-5a84c0881d16" (UID: "ad622716-6050-414e-b1c3-5a84c0881d16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.575840 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.575889 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.575905 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.575917 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89xb2\" (UniqueName: \"kubernetes.io/projected/ad622716-6050-414e-b1c3-5a84c0881d16-kube-api-access-89xb2\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.575930 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad622716-6050-414e-b1c3-5a84c0881d16-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.806081 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.811606 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-td2ws" event={"ID":"ad622716-6050-414e-b1c3-5a84c0881d16","Type":"ContainerDied","Data":"1f28a2416838130bf6466d0e33ce0fde48c6ee329b6c92bb5251cf451cccfe82"} Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.811692 4992 scope.go:117] "RemoveContainer" containerID="d8073dcb0ad1a519a5f0fcc8b3b2d7af060581856e49c38ea24df8525e40ad5e" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.841806 4992 scope.go:117] "RemoveContainer" containerID="8051ce022b6ddcf48880aa23758db55c937862f3d55aed824388d8195cf800f6" Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.853535 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-td2ws"] Jan 31 09:47:37 crc kubenswrapper[4992]: I0131 09:47:37.867613 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-td2ws"] Jan 31 09:47:39 crc kubenswrapper[4992]: I0131 09:47:39.192458 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" path="/var/lib/kubelet/pods/ad622716-6050-414e-b1c3-5a84c0881d16/volumes" Jan 31 09:47:41 crc kubenswrapper[4992]: I0131 09:47:41.861851 4992 generic.go:334] "Generic (PLEG): container finished" podID="2f340cf4-078d-4c29-819e-0e29fc2ff63b" containerID="d179f0c605b0e131a357af624807c1d9dd307cc6a6a12afe3fcb80c4d1e2deec" exitCode=0 Jan 31 09:47:41 crc kubenswrapper[4992]: I0131 09:47:41.861942 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zffb4" event={"ID":"2f340cf4-078d-4c29-819e-0e29fc2ff63b","Type":"ContainerDied","Data":"d179f0c605b0e131a357af624807c1d9dd307cc6a6a12afe3fcb80c4d1e2deec"} Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.244537 4992 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.244859 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.247970 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.377216 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-config-data\") pod \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.377352 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-combined-ca-bundle\") pod \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.377489 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnzks\" (UniqueName: \"kubernetes.io/projected/2f340cf4-078d-4c29-819e-0e29fc2ff63b-kube-api-access-dnzks\") pod \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.377513 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-scripts\") pod \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\" (UID: \"2f340cf4-078d-4c29-819e-0e29fc2ff63b\") " Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.382484 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2f340cf4-078d-4c29-819e-0e29fc2ff63b-kube-api-access-dnzks" (OuterVolumeSpecName: "kube-api-access-dnzks") pod "2f340cf4-078d-4c29-819e-0e29fc2ff63b" (UID: "2f340cf4-078d-4c29-819e-0e29fc2ff63b"). InnerVolumeSpecName "kube-api-access-dnzks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.400788 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-scripts" (OuterVolumeSpecName: "scripts") pod "2f340cf4-078d-4c29-819e-0e29fc2ff63b" (UID: "2f340cf4-078d-4c29-819e-0e29fc2ff63b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.403303 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f340cf4-078d-4c29-819e-0e29fc2ff63b" (UID: "2f340cf4-078d-4c29-819e-0e29fc2ff63b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.413296 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-config-data" (OuterVolumeSpecName: "config-data") pod "2f340cf4-078d-4c29-819e-0e29fc2ff63b" (UID: "2f340cf4-078d-4c29-819e-0e29fc2ff63b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.479408 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.479463 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnzks\" (UniqueName: \"kubernetes.io/projected/2f340cf4-078d-4c29-819e-0e29fc2ff63b-kube-api-access-dnzks\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.479477 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.479487 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f340cf4-078d-4c29-819e-0e29fc2ff63b-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.879613 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zffb4" event={"ID":"2f340cf4-078d-4c29-819e-0e29fc2ff63b","Type":"ContainerDied","Data":"1db2de40c2e7341cd5a60b3c882e9120b79fab243da6371f89f144e4ad7c6f36"} Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.879925 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1db2de40c2e7341cd5a60b3c882e9120b79fab243da6371f89f144e4ad7c6f36" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.879673 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zffb4" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.975826 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.976266 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-log" containerID="cri-o://22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516" gracePeriod=30 Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.976720 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-api" containerID="cri-o://2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349" gracePeriod=30 Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.986634 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": EOF" Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.988116 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.988293 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d9a803da-9736-4555-8851-5a87c3421592" containerName="nova-scheduler-scheduler" containerID="cri-o://5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5" gracePeriod=30 Jan 31 09:47:43 crc kubenswrapper[4992]: I0131 09:47:43.988968 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": EOF" Jan 
31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.028676 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.028908 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-log" containerID="cri-o://0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541" gracePeriod=30 Jan 31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.028958 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-metadata" containerID="cri-o://1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55" gracePeriod=30 Jan 31 09:47:44 crc kubenswrapper[4992]: E0131 09:47:44.861007 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 31 09:47:44 crc kubenswrapper[4992]: E0131 09:47:44.862681 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 31 09:47:44 crc kubenswrapper[4992]: E0131 09:47:44.864178 4992 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 31 09:47:44 crc kubenswrapper[4992]: E0131 09:47:44.864217 4992 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d9a803da-9736-4555-8851-5a87c3421592" containerName="nova-scheduler-scheduler" Jan 31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.889605 4992 generic.go:334] "Generic (PLEG): container finished" podID="d7561038-f719-41eb-96ae-59d27985c8eb" containerID="22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516" exitCode=143 Jan 31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.889667 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7561038-f719-41eb-96ae-59d27985c8eb","Type":"ContainerDied","Data":"22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516"} Jan 31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.891590 4992 generic.go:334] "Generic (PLEG): container finished" podID="d91af037-5b1b-4543-810e-06667d38a865" containerID="0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541" exitCode=143 Jan 31 09:47:44 crc kubenswrapper[4992]: I0131 09:47:44.891618 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d91af037-5b1b-4543-810e-06667d38a865","Type":"ContainerDied","Data":"0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541"} Jan 31 09:47:45 crc kubenswrapper[4992]: I0131 09:47:45.301265 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:47:45 crc kubenswrapper[4992]: I0131 09:47:45.301334 4992 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.240400 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": read tcp 10.217.0.2:43322->10.217.0.185:8775: read: connection reset by peer" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.240401 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": read tcp 10.217.0.2:43316->10.217.0.185:8775: read: connection reset by peer" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.757207 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.864793 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-nova-metadata-tls-certs\") pod \"d91af037-5b1b-4543-810e-06667d38a865\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.864935 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-config-data\") pod \"d91af037-5b1b-4543-810e-06667d38a865\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.865079 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4wj8\" (UniqueName: \"kubernetes.io/projected/d91af037-5b1b-4543-810e-06667d38a865-kube-api-access-f4wj8\") pod \"d91af037-5b1b-4543-810e-06667d38a865\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.865161 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91af037-5b1b-4543-810e-06667d38a865-logs\") pod \"d91af037-5b1b-4543-810e-06667d38a865\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.865226 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-combined-ca-bundle\") pod \"d91af037-5b1b-4543-810e-06667d38a865\" (UID: \"d91af037-5b1b-4543-810e-06667d38a865\") " Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.865601 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d91af037-5b1b-4543-810e-06667d38a865-logs" (OuterVolumeSpecName: "logs") pod "d91af037-5b1b-4543-810e-06667d38a865" (UID: "d91af037-5b1b-4543-810e-06667d38a865"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.866030 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d91af037-5b1b-4543-810e-06667d38a865-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.882181 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91af037-5b1b-4543-810e-06667d38a865-kube-api-access-f4wj8" (OuterVolumeSpecName: "kube-api-access-f4wj8") pod "d91af037-5b1b-4543-810e-06667d38a865" (UID: "d91af037-5b1b-4543-810e-06667d38a865"). InnerVolumeSpecName "kube-api-access-f4wj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.895614 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d91af037-5b1b-4543-810e-06667d38a865" (UID: "d91af037-5b1b-4543-810e-06667d38a865"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.908600 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-config-data" (OuterVolumeSpecName: "config-data") pod "d91af037-5b1b-4543-810e-06667d38a865" (UID: "d91af037-5b1b-4543-810e-06667d38a865"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.920190 4992 generic.go:334] "Generic (PLEG): container finished" podID="d91af037-5b1b-4543-810e-06667d38a865" containerID="1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55" exitCode=0 Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.920247 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d91af037-5b1b-4543-810e-06667d38a865","Type":"ContainerDied","Data":"1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55"} Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.920286 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d91af037-5b1b-4543-810e-06667d38a865","Type":"ContainerDied","Data":"26e4672c696ec6e1be2d536bf7979cc84cbcab03747671c1da8736f2a6369a27"} Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.920336 4992 scope.go:117] "RemoveContainer" containerID="1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.920889 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.949376 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d91af037-5b1b-4543-810e-06667d38a865" (UID: "d91af037-5b1b-4543-810e-06667d38a865"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.967857 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.968104 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4wj8\" (UniqueName: \"kubernetes.io/projected/d91af037-5b1b-4543-810e-06667d38a865-kube-api-access-f4wj8\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.968163 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.968272 4992 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91af037-5b1b-4543-810e-06667d38a865-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.978763 4992 scope.go:117] "RemoveContainer" containerID="0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.997215 4992 scope.go:117] "RemoveContainer" containerID="1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55" Jan 31 09:47:47 crc kubenswrapper[4992]: E0131 09:47:47.998243 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55\": container with ID starting with 1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55 not found: ID does not exist" containerID="1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55" Jan 31 09:47:47 crc 
kubenswrapper[4992]: I0131 09:47:47.998279 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55"} err="failed to get container status \"1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55\": rpc error: code = NotFound desc = could not find container \"1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55\": container with ID starting with 1367720b0b5897db3d75f03fc07bb07dd938e476f5f83e76ddaa1f0c9dc87a55 not found: ID does not exist" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.998299 4992 scope.go:117] "RemoveContainer" containerID="0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541" Jan 31 09:47:47 crc kubenswrapper[4992]: E0131 09:47:47.998528 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541\": container with ID starting with 0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541 not found: ID does not exist" containerID="0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541" Jan 31 09:47:47 crc kubenswrapper[4992]: I0131 09:47:47.998547 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541"} err="failed to get container status \"0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541\": rpc error: code = NotFound desc = could not find container \"0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541\": container with ID starting with 0f93012b081cbd9e15257cfe62962baadc5bb987e0540c1599416a8988394541 not found: ID does not exist" Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.278481 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 31 09:47:48 crc 
kubenswrapper[4992]: I0131 09:47:48.289810 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.298301 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 31 09:47:48 crc kubenswrapper[4992]: E0131 09:47:48.299182 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" containerName="init"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.299296 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" containerName="init"
Jan 31 09:47:48 crc kubenswrapper[4992]: E0131 09:47:48.299409 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-log"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.299541 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-log"
Jan 31 09:47:48 crc kubenswrapper[4992]: E0131 09:47:48.299610 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" containerName="dnsmasq-dns"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.299669 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" containerName="dnsmasq-dns"
Jan 31 09:47:48 crc kubenswrapper[4992]: E0131 09:47:48.299731 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-metadata"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.299796 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-metadata"
Jan 31 09:47:48 crc kubenswrapper[4992]: E0131 09:47:48.299858 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f340cf4-078d-4c29-819e-0e29fc2ff63b" containerName="nova-manage"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.299913 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f340cf4-078d-4c29-819e-0e29fc2ff63b" containerName="nova-manage"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.300108 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad622716-6050-414e-b1c3-5a84c0881d16" containerName="dnsmasq-dns"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.300172 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-log"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.300224 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d91af037-5b1b-4543-810e-06667d38a865" containerName="nova-metadata-metadata"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.300291 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f340cf4-078d-4c29-819e-0e29fc2ff63b" containerName="nova-manage"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.301222 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.305772 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.307350 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.307831 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.373931 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skl9h\" (UniqueName: \"kubernetes.io/projected/cb802601-7090-4e10-a3e5-3fc64959cbe9-kube-api-access-skl9h\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.373992 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-config-data\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.374035 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.374061 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb802601-7090-4e10-a3e5-3fc64959cbe9-logs\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.374101 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.476046 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skl9h\" (UniqueName: \"kubernetes.io/projected/cb802601-7090-4e10-a3e5-3fc64959cbe9-kube-api-access-skl9h\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.476365 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-config-data\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.476437 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.476480 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb802601-7090-4e10-a3e5-3fc64959cbe9-logs\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.476525 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.478561 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb802601-7090-4e10-a3e5-3fc64959cbe9-logs\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.480301 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-config-data\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.480716 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.483303 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb802601-7090-4e10-a3e5-3fc64959cbe9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.497160 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skl9h\" (UniqueName: \"kubernetes.io/projected/cb802601-7090-4e10-a3e5-3fc64959cbe9-kube-api-access-skl9h\") pod \"nova-metadata-0\" (UID: \"cb802601-7090-4e10-a3e5-3fc64959cbe9\") " pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.604911 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.627620 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.679339 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-config-data\") pod \"d9a803da-9736-4555-8851-5a87c3421592\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") "
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.679396 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-combined-ca-bundle\") pod \"d9a803da-9736-4555-8851-5a87c3421592\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") "
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.679649 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzzgk\" (UniqueName: \"kubernetes.io/projected/d9a803da-9736-4555-8851-5a87c3421592-kube-api-access-bzzgk\") pod \"d9a803da-9736-4555-8851-5a87c3421592\" (UID: \"d9a803da-9736-4555-8851-5a87c3421592\") "
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.684038 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a803da-9736-4555-8851-5a87c3421592-kube-api-access-bzzgk" (OuterVolumeSpecName: "kube-api-access-bzzgk") pod "d9a803da-9736-4555-8851-5a87c3421592" (UID: "d9a803da-9736-4555-8851-5a87c3421592"). InnerVolumeSpecName "kube-api-access-bzzgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.711963 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9a803da-9736-4555-8851-5a87c3421592" (UID: "d9a803da-9736-4555-8851-5a87c3421592"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.714854 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-config-data" (OuterVolumeSpecName: "config-data") pod "d9a803da-9736-4555-8851-5a87c3421592" (UID: "d9a803da-9736-4555-8851-5a87c3421592"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.782013 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzzgk\" (UniqueName: \"kubernetes.io/projected/d9a803da-9736-4555-8851-5a87c3421592-kube-api-access-bzzgk\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.782046 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.782060 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9a803da-9736-4555-8851-5a87c3421592-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.929361 4992 generic.go:334] "Generic (PLEG): container finished" podID="d9a803da-9736-4555-8851-5a87c3421592" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5" exitCode=0
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.929410 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.929450 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a803da-9736-4555-8851-5a87c3421592","Type":"ContainerDied","Data":"5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5"}
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.929935 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d9a803da-9736-4555-8851-5a87c3421592","Type":"ContainerDied","Data":"877dea6aa3bcc5c0f8ce961c862f25a7658066cc1dd987631d403f2c2d9ebfc1"}
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.929956 4992 scope.go:117] "RemoveContainer" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.951573 4992 scope.go:117] "RemoveContainer" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5"
Jan 31 09:47:48 crc kubenswrapper[4992]: E0131 09:47:48.953004 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5\": container with ID starting with 5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5 not found: ID does not exist" containerID="5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.953047 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5"} err="failed to get container status \"5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5\": rpc error: code = NotFound desc = could not find container \"5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5\": container with ID starting with 5c1c69f02efb55cc3cbd4eb79888e1cd59dfbbde880a3fc642117d8c9638f7f5 not found: ID does not exist"
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.972138 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 31 09:47:48 crc kubenswrapper[4992]: I0131 09:47:48.990666 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.001561 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 31 09:47:49 crc kubenswrapper[4992]: E0131 09:47:49.002054 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a803da-9736-4555-8851-5a87c3421592" containerName="nova-scheduler-scheduler"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.002088 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a803da-9736-4555-8851-5a87c3421592" containerName="nova-scheduler-scheduler"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.002350 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a803da-9736-4555-8851-5a87c3421592" containerName="nova-scheduler-scheduler"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.003113 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.006852 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.011205 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.086530 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542d01d8-f2f8-4513-857e-cdc828f381f9-config-data\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.086633 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d01d8-f2f8-4513-857e-cdc828f381f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.086697 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgr9m\" (UniqueName: \"kubernetes.io/projected/542d01d8-f2f8-4513-857e-cdc828f381f9-kube-api-access-rgr9m\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.096009 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.188718 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d01d8-f2f8-4513-857e-cdc828f381f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.188802 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgr9m\" (UniqueName: \"kubernetes.io/projected/542d01d8-f2f8-4513-857e-cdc828f381f9-kube-api-access-rgr9m\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.188918 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542d01d8-f2f8-4513-857e-cdc828f381f9-config-data\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.194064 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d91af037-5b1b-4543-810e-06667d38a865" path="/var/lib/kubelet/pods/d91af037-5b1b-4543-810e-06667d38a865/volumes"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.194798 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a803da-9736-4555-8851-5a87c3421592" path="/var/lib/kubelet/pods/d9a803da-9736-4555-8851-5a87c3421592/volumes"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.195075 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/542d01d8-f2f8-4513-857e-cdc828f381f9-config-data\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.195259 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d01d8-f2f8-4513-857e-cdc828f381f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.206291 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgr9m\" (UniqueName: \"kubernetes.io/projected/542d01d8-f2f8-4513-857e-cdc828f381f9-kube-api-access-rgr9m\") pod \"nova-scheduler-0\" (UID: \"542d01d8-f2f8-4513-857e-cdc828f381f9\") " pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.322076 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.806964 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 31 09:47:49 crc kubenswrapper[4992]: W0131 09:47:49.811451 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod542d01d8_f2f8_4513_857e_cdc828f381f9.slice/crio-8dedaa85c1155b93d0b9b0fcff111b927eb326b398f6dcb43d0ed754a6b1840f WatchSource:0}: Error finding container 8dedaa85c1155b93d0b9b0fcff111b927eb326b398f6dcb43d0ed754a6b1840f: Status 404 returned error can't find the container with id 8dedaa85c1155b93d0b9b0fcff111b927eb326b398f6dcb43d0ed754a6b1840f
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.913912 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.948117 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb802601-7090-4e10-a3e5-3fc64959cbe9","Type":"ContainerStarted","Data":"fe6011ebad76d2dfff3ddd925795e43400d7a6434f5f3bae954821739be61154"}
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.948179 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb802601-7090-4e10-a3e5-3fc64959cbe9","Type":"ContainerStarted","Data":"c3eea936df86849991c717b7f1849c637835fd9e107751b26ef34698e4cc2572"}
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.948213 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cb802601-7090-4e10-a3e5-3fc64959cbe9","Type":"ContainerStarted","Data":"e49df38781b7e5bd213b2c2248d27df7a024d07c5c3947b4617fa528b7a2a64b"}
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.953573 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"542d01d8-f2f8-4513-857e-cdc828f381f9","Type":"ContainerStarted","Data":"8dedaa85c1155b93d0b9b0fcff111b927eb326b398f6dcb43d0ed754a6b1840f"}
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.955898 4992 generic.go:334] "Generic (PLEG): container finished" podID="d7561038-f719-41eb-96ae-59d27985c8eb" containerID="2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349" exitCode=0
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.955949 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7561038-f719-41eb-96ae-59d27985c8eb","Type":"ContainerDied","Data":"2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349"}
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.955980 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d7561038-f719-41eb-96ae-59d27985c8eb","Type":"ContainerDied","Data":"ee3ca92e7ef4699a0ef0d8e8921c1342185e210815dc1623f4a295db8c573746"}
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.956000 4992 scope.go:117] "RemoveContainer" containerID="2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.956100 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.978073 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9780508989999999 podStartE2EDuration="1.978050899s" podCreationTimestamp="2026-01-31 09:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:49.964084819 +0000 UTC m=+1365.935476826" watchObservedRunningTime="2026-01-31 09:47:49.978050899 +0000 UTC m=+1365.949442886"
Jan 31 09:47:49 crc kubenswrapper[4992]: I0131 09:47:49.997675 4992 scope.go:117] "RemoveContainer" containerID="22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.006603 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-combined-ca-bundle\") pod \"d7561038-f719-41eb-96ae-59d27985c8eb\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") "
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.006644 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8g49\" (UniqueName: \"kubernetes.io/projected/d7561038-f719-41eb-96ae-59d27985c8eb-kube-api-access-j8g49\") pod \"d7561038-f719-41eb-96ae-59d27985c8eb\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") "
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.006733 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-config-data\") pod \"d7561038-f719-41eb-96ae-59d27985c8eb\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") "
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.006811 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-internal-tls-certs\") pod \"d7561038-f719-41eb-96ae-59d27985c8eb\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") "
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.006837 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7561038-f719-41eb-96ae-59d27985c8eb-logs\") pod \"d7561038-f719-41eb-96ae-59d27985c8eb\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") "
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.006867 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-public-tls-certs\") pod \"d7561038-f719-41eb-96ae-59d27985c8eb\" (UID: \"d7561038-f719-41eb-96ae-59d27985c8eb\") "
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.009010 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7561038-f719-41eb-96ae-59d27985c8eb-logs" (OuterVolumeSpecName: "logs") pod "d7561038-f719-41eb-96ae-59d27985c8eb" (UID: "d7561038-f719-41eb-96ae-59d27985c8eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.012354 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7561038-f719-41eb-96ae-59d27985c8eb-kube-api-access-j8g49" (OuterVolumeSpecName: "kube-api-access-j8g49") pod "d7561038-f719-41eb-96ae-59d27985c8eb" (UID: "d7561038-f719-41eb-96ae-59d27985c8eb"). InnerVolumeSpecName "kube-api-access-j8g49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.023260 4992 scope.go:117] "RemoveContainer" containerID="2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349"
Jan 31 09:47:50 crc kubenswrapper[4992]: E0131 09:47:50.023674 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349\": container with ID starting with 2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349 not found: ID does not exist" containerID="2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.023739 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349"} err="failed to get container status \"2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349\": rpc error: code = NotFound desc = could not find container \"2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349\": container with ID starting with 2eb7db7fc0b66d889eb67880a7883881ac3261d44a78478ddad86ae009c6e349 not found: ID does not exist"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.023769 4992 scope.go:117] "RemoveContainer" containerID="22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516"
Jan 31 09:47:50 crc kubenswrapper[4992]: E0131 09:47:50.024109 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516\": container with ID starting with 22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516 not found: ID does not exist" containerID="22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.024131 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516"} err="failed to get container status \"22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516\": rpc error: code = NotFound desc = could not find container \"22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516\": container with ID starting with 22ef6981d5b8d249036ba247f24ba8f6a0affcce29e9cf3533cf793f121a3516 not found: ID does not exist"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.036602 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-config-data" (OuterVolumeSpecName: "config-data") pod "d7561038-f719-41eb-96ae-59d27985c8eb" (UID: "d7561038-f719-41eb-96ae-59d27985c8eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.045119 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7561038-f719-41eb-96ae-59d27985c8eb" (UID: "d7561038-f719-41eb-96ae-59d27985c8eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.059904 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d7561038-f719-41eb-96ae-59d27985c8eb" (UID: "d7561038-f719-41eb-96ae-59d27985c8eb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.061393 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d7561038-f719-41eb-96ae-59d27985c8eb" (UID: "d7561038-f719-41eb-96ae-59d27985c8eb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.108725 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.108998 4992 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.109088 4992 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7561038-f719-41eb-96ae-59d27985c8eb-logs\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.109203 4992 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.109290 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7561038-f719-41eb-96ae-59d27985c8eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.109368 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8g49\" (UniqueName: \"kubernetes.io/projected/d7561038-f719-41eb-96ae-59d27985c8eb-kube-api-access-j8g49\") on node \"crc\" DevicePath \"\""
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.300439 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.318897 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.337927 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 31 09:47:50 crc kubenswrapper[4992]: E0131 09:47:50.338393 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-log"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.338413 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-log"
Jan 31 09:47:50 crc kubenswrapper[4992]: E0131 09:47:50.338451 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-api"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.338459 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-api"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.338710 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-api"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.338730 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" containerName="nova-api-log"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.339849 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.343767 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.344285 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.344556 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.350715 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.414068 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-config-data\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.414347 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.416458 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rms\" (UniqueName: \"kubernetes.io/projected/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-kube-api-access-88rms\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.416724 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.416872 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.416994 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-logs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.521048 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0"
Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.521630 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") "
pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.521995 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-logs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.528649 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-config-data\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.528844 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.528894 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rms\" (UniqueName: \"kubernetes.io/projected/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-kube-api-access-88rms\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.526259 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.526858 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-logs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.537291 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.541665 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-public-tls-certs\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.547661 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-config-data\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.554622 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rms\" (UniqueName: \"kubernetes.io/projected/ac11de09-452e-4bbb-b8f6-09ea1beea4e0-kube-api-access-88rms\") pod \"nova-api-0\" (UID: \"ac11de09-452e-4bbb-b8f6-09ea1beea4e0\") " pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.719783 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.964865 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"542d01d8-f2f8-4513-857e-cdc828f381f9","Type":"ContainerStarted","Data":"99f4ddf4da43d4e584a3a7cf4a3a00edcd8c07507cc6001469f6872741ce55f8"} Jan 31 09:47:50 crc kubenswrapper[4992]: I0131 09:47:50.984291 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.984276786 podStartE2EDuration="2.984276786s" podCreationTimestamp="2026-01-31 09:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:50.982776893 +0000 UTC m=+1366.954168880" watchObservedRunningTime="2026-01-31 09:47:50.984276786 +0000 UTC m=+1366.955668773" Jan 31 09:47:51 crc kubenswrapper[4992]: W0131 09:47:51.163555 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac11de09_452e_4bbb_b8f6_09ea1beea4e0.slice/crio-3d44a15d93bada2c21de3ae741e41940ecb043c0fbb10d5d468d5690e32ac75e WatchSource:0}: Error finding container 3d44a15d93bada2c21de3ae741e41940ecb043c0fbb10d5d468d5690e32ac75e: Status 404 returned error can't find the container with id 3d44a15d93bada2c21de3ae741e41940ecb043c0fbb10d5d468d5690e32ac75e Jan 31 09:47:51 crc kubenswrapper[4992]: I0131 09:47:51.194214 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7561038-f719-41eb-96ae-59d27985c8eb" path="/var/lib/kubelet/pods/d7561038-f719-41eb-96ae-59d27985c8eb/volumes" Jan 31 09:47:51 crc kubenswrapper[4992]: I0131 09:47:51.195063 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 31 09:47:51 crc kubenswrapper[4992]: I0131 09:47:51.997321 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ac11de09-452e-4bbb-b8f6-09ea1beea4e0","Type":"ContainerStarted","Data":"368f5c281d2123d413545e59f08c0bc68d9c471e53f3d43e26160dcee5a064fe"} Jan 31 09:47:51 crc kubenswrapper[4992]: I0131 09:47:51.997675 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac11de09-452e-4bbb-b8f6-09ea1beea4e0","Type":"ContainerStarted","Data":"7a3f8c3ca3fccfd31310fa69b21e8a63b58203441992ff552f7b9ae93c190e68"} Jan 31 09:47:51 crc kubenswrapper[4992]: I0131 09:47:51.997691 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac11de09-452e-4bbb-b8f6-09ea1beea4e0","Type":"ContainerStarted","Data":"3d44a15d93bada2c21de3ae741e41940ecb043c0fbb10d5d468d5690e32ac75e"} Jan 31 09:47:52 crc kubenswrapper[4992]: I0131 09:47:52.029669 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.029646305 podStartE2EDuration="2.029646305s" podCreationTimestamp="2026-01-31 09:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:47:52.02319666 +0000 UTC m=+1367.994588667" watchObservedRunningTime="2026-01-31 09:47:52.029646305 +0000 UTC m=+1368.001038292" Jan 31 09:47:53 crc kubenswrapper[4992]: I0131 09:47:53.627790 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:47:53 crc kubenswrapper[4992]: I0131 09:47:53.628152 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 31 09:47:54 crc kubenswrapper[4992]: I0131 09:47:54.322765 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 31 09:47:58 crc kubenswrapper[4992]: I0131 09:47:58.628290 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:47:58 crc 
kubenswrapper[4992]: I0131 09:47:58.628707 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 31 09:47:59 crc kubenswrapper[4992]: I0131 09:47:59.323220 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 31 09:47:59 crc kubenswrapper[4992]: I0131 09:47:59.366162 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 31 09:47:59 crc kubenswrapper[4992]: I0131 09:47:59.644582 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cb802601-7090-4e10-a3e5-3fc64959cbe9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:47:59 crc kubenswrapper[4992]: I0131 09:47:59.644591 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cb802601-7090-4e10-a3e5-3fc64959cbe9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:48:00 crc kubenswrapper[4992]: I0131 09:48:00.121731 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 31 09:48:00 crc kubenswrapper[4992]: I0131 09:48:00.409979 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 09:48:00 crc kubenswrapper[4992]: I0131 09:48:00.723134 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:48:00 crc kubenswrapper[4992]: I0131 09:48:00.724298 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 31 09:48:01 crc kubenswrapper[4992]: I0131 09:48:01.731649 4992 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ac11de09-452e-4bbb-b8f6-09ea1beea4e0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:48:01 crc kubenswrapper[4992]: I0131 09:48:01.731666 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ac11de09-452e-4bbb-b8f6-09ea1beea4e0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 31 09:48:08 crc kubenswrapper[4992]: I0131 09:48:08.635495 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 31 09:48:08 crc kubenswrapper[4992]: I0131 09:48:08.636388 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 31 09:48:08 crc kubenswrapper[4992]: I0131 09:48:08.643804 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 31 09:48:08 crc kubenswrapper[4992]: I0131 09:48:08.645773 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 31 09:48:10 crc kubenswrapper[4992]: I0131 09:48:10.727583 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:48:10 crc kubenswrapper[4992]: I0131 09:48:10.728052 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 31 09:48:10 crc kubenswrapper[4992]: I0131 09:48:10.728272 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:48:10 crc kubenswrapper[4992]: I0131 09:48:10.728302 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 31 09:48:10 crc 
kubenswrapper[4992]: I0131 09:48:10.734697 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:48:10 crc kubenswrapper[4992]: I0131 09:48:10.735702 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 31 09:48:15 crc kubenswrapper[4992]: I0131 09:48:15.301316 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:48:15 crc kubenswrapper[4992]: I0131 09:48:15.301934 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:48:18 crc kubenswrapper[4992]: I0131 09:48:18.567818 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:48:19 crc kubenswrapper[4992]: I0131 09:48:19.416766 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:48:22 crc kubenswrapper[4992]: I0131 09:48:22.775858 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="rabbitmq" containerID="cri-o://cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be" gracePeriod=604796 Jan 31 09:48:23 crc kubenswrapper[4992]: I0131 09:48:23.603962 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="rabbitmq" 
containerID="cri-o://3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda" gracePeriod=604796 Jan 31 09:48:27 crc kubenswrapper[4992]: I0131 09:48:27.539624 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Jan 31 09:48:27 crc kubenswrapper[4992]: I0131 09:48:27.914360 4992 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.331726 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.370461 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-server-conf\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.370593 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-config-data\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.370774 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8swl\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-kube-api-access-s8swl\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc 
kubenswrapper[4992]: I0131 09:48:29.370903 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-erlang-cookie\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.370975 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.371093 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-plugins-conf\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.371161 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-tls\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.371722 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-plugins\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.371847 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/71b7a97b-2d62-4b05-84f6-fc720ce9c672-erlang-cookie-secret\") 
pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.372303 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/71b7a97b-2d62-4b05-84f6-fc720ce9c672-pod-info\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.372407 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-confd\") pod \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\" (UID: \"71b7a97b-2d62-4b05-84f6-fc720ce9c672\") " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.372936 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.375992 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.379683 4992 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.379851 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.383726 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.397180 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.401884 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b7a97b-2d62-4b05-84f6-fc720ce9c672-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.403681 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-kube-api-access-s8swl" (OuterVolumeSpecName: "kube-api-access-s8swl") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "kube-api-access-s8swl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.407966 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/71b7a97b-2d62-4b05-84f6-fc720ce9c672-pod-info" (OuterVolumeSpecName: "pod-info") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.415481 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.429700 4992 generic.go:334] "Generic (PLEG): container finished" podID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerID="cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be" exitCode=0 Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.429754 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"71b7a97b-2d62-4b05-84f6-fc720ce9c672","Type":"ContainerDied","Data":"cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be"} Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.429786 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"71b7a97b-2d62-4b05-84f6-fc720ce9c672","Type":"ContainerDied","Data":"ee64adbfda205b91b125ea43c91390d00e39c2c74180d500c7527cb06988dc53"} Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.429912 4992 scope.go:117] "RemoveContainer" containerID="cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.430411 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.440149 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-config-data" (OuterVolumeSpecName: "config-data") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.459025 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-server-conf" (OuterVolumeSpecName: "server-conf") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.477722 4992 scope.go:117] "RemoveContainer" containerID="d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.481937 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.482260 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8swl\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-kube-api-access-s8swl\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.482428 4992 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.482861 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.482968 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-plugins\") on node \"crc\" DevicePath 
\"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.483072 4992 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/71b7a97b-2d62-4b05-84f6-fc720ce9c672-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.483168 4992 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/71b7a97b-2d62-4b05-84f6-fc720ce9c672-pod-info\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.483260 4992 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/71b7a97b-2d62-4b05-84f6-fc720ce9c672-server-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.505952 4992 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.506715 4992 scope.go:117] "RemoveContainer" containerID="cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be" Jan 31 09:48:29 crc kubenswrapper[4992]: E0131 09:48:29.509324 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be\": container with ID starting with cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be not found: ID does not exist" containerID="cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.509376 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be"} err="failed to get container status 
\"cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be\": rpc error: code = NotFound desc = could not find container \"cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be\": container with ID starting with cb3f7bf108bf934f980a69df97114683a867fed757511e2d4bb8c17bc98f62be not found: ID does not exist" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.509408 4992 scope.go:117] "RemoveContainer" containerID="d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa" Jan 31 09:48:29 crc kubenswrapper[4992]: E0131 09:48:29.509708 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa\": container with ID starting with d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa not found: ID does not exist" containerID="d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.509736 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa"} err="failed to get container status \"d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa\": rpc error: code = NotFound desc = could not find container \"d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa\": container with ID starting with d22d9ae6579988e4f2c265a9155d5f7266ad4c61c07fd18ab71ac6a17f9af9aa not found: ID does not exist" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.553757 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "71b7a97b-2d62-4b05-84f6-fc720ce9c672" (UID: "71b7a97b-2d62-4b05-84f6-fc720ce9c672"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.584975 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/71b7a97b-2d62-4b05-84f6-fc720ce9c672-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.585020 4992 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.832843 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.841279 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.870870 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:48:29 crc kubenswrapper[4992]: E0131 09:48:29.871274 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="setup-container" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.871295 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="setup-container" Jan 31 09:48:29 crc kubenswrapper[4992]: E0131 09:48:29.871319 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="rabbitmq" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.871326 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="rabbitmq" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.873225 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" containerName="rabbitmq" 
Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.876893 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.879646 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.880225 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.882089 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.882361 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.882533 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.882642 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ft75t" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.885052 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.885537 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993648 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/27279979-e584-4689-893b-6357ed920fef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993731 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993767 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-config-data\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993805 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/27279979-e584-4689-893b-6357ed920fef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993859 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993884 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/27279979-e584-4689-893b-6357ed920fef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993919 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.993944 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/27279979-e584-4689-893b-6357ed920fef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.994016 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x2xr\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-kube-api-access-4x2xr\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.994051 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:29 crc kubenswrapper[4992]: I0131 09:48:29.994074 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096007 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-server-conf\") 
pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096048 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096084 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/27279979-e584-4689-893b-6357ed920fef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096134 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096151 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-config-data\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096180 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/27279979-e584-4689-893b-6357ed920fef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 
09:48:30.096204 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096222 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/27279979-e584-4689-893b-6357ed920fef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096242 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096261 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/27279979-e584-4689-893b-6357ed920fef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.096313 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x2xr\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-kube-api-access-4x2xr\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.097172 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.098190 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-config-data\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.098294 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.101634 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/27279979-e584-4689-893b-6357ed920fef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.103038 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.103616 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/27279979-e584-4689-893b-6357ed920fef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " 
pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.104098 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/27279979-e584-4689-893b-6357ed920fef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.117280 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/27279979-e584-4689-893b-6357ed920fef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.117401 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.118181 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x2xr\" (UniqueName: \"kubernetes.io/projected/27279979-e584-4689-893b-6357ed920fef-kube-api-access-4x2xr\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.118836 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/27279979-e584-4689-893b-6357ed920fef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.146437 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"27279979-e584-4689-893b-6357ed920fef\") " pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.212002 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.214335 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407050 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8005e2e7-ed00-4af1-be65-12638ce3a9f9-pod-info\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407476 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8005e2e7-ed00-4af1-be65-12638ce3a9f9-erlang-cookie-secret\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407525 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-erlang-cookie\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407563 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-plugins\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc 
kubenswrapper[4992]: I0131 09:48:30.407588 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-tls\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407638 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407709 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-confd\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407749 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-server-conf\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407768 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-plugins-conf\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407805 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcvfl\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-kube-api-access-hcvfl\") pod 
\"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.407902 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-config-data\") pod \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\" (UID: \"8005e2e7-ed00-4af1-be65-12638ce3a9f9\") " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.408257 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.408472 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.408820 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.415734 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-kube-api-access-hcvfl" (OuterVolumeSpecName: "kube-api-access-hcvfl") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "kube-api-access-hcvfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.416391 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.416831 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8005e2e7-ed00-4af1-be65-12638ce3a9f9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.418157 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8005e2e7-ed00-4af1-be65-12638ce3a9f9-pod-info" (OuterVolumeSpecName: "pod-info") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.421642 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.442923 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-config-data" (OuterVolumeSpecName: "config-data") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.452897 4992 generic.go:334] "Generic (PLEG): container finished" podID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerID="3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda" exitCode=0 Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.452962 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.453054 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8005e2e7-ed00-4af1-be65-12638ce3a9f9","Type":"ContainerDied","Data":"3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda"} Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.453092 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8005e2e7-ed00-4af1-be65-12638ce3a9f9","Type":"ContainerDied","Data":"d35359be5ec36ab778e90725b0cac71425cab881f43b8bbbc3a86ca2f43ace88"} Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.453163 4992 scope.go:117] "RemoveContainer" containerID="3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.473117 4992 scope.go:117] "RemoveContainer" containerID="7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.498289 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-server-conf" (OuterVolumeSpecName: "server-conf") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.501147 4992 scope.go:117] "RemoveContainer" containerID="3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda" Jan 31 09:48:30 crc kubenswrapper[4992]: E0131 09:48:30.501612 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda\": container with ID starting with 3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda not found: ID does not exist" containerID="3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.501657 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda"} err="failed to get container status \"3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda\": rpc error: code = NotFound desc = could not find container \"3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda\": container with ID starting with 3160b7bf213a050caccbc202ef1bc48be0ea7daa7c411af5744ac7f8e303beda not found: ID does not exist" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.501687 4992 scope.go:117] "RemoveContainer" containerID="7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc" Jan 31 09:48:30 crc kubenswrapper[4992]: E0131 09:48:30.502136 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc\": container with ID starting with 7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc not found: ID does not exist" containerID="7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.502177 
4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc"} err="failed to get container status \"7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc\": rpc error: code = NotFound desc = could not find container \"7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc\": container with ID starting with 7528d895a9359285fce439f82f52be161eb89351642e8d36a40b0419f286cfdc not found: ID does not exist" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509788 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcvfl\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-kube-api-access-hcvfl\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509819 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509828 4992 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8005e2e7-ed00-4af1-be65-12638ce3a9f9-pod-info\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509837 4992 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8005e2e7-ed00-4af1-be65-12638ce3a9f9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509845 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509853 4992 reconciler_common.go:293] "Volume 
detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509861 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509883 4992 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509891 4992 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-server-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.509898 4992 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8005e2e7-ed00-4af1-be65-12638ce3a9f9-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.530029 4992 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.539377 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8005e2e7-ed00-4af1-be65-12638ce3a9f9" (UID: "8005e2e7-ed00-4af1-be65-12638ce3a9f9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.611385 4992 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.611409 4992 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8005e2e7-ed00-4af1-be65-12638ce3a9f9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.776360 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.798042 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.806140 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.822563 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:48:30 crc kubenswrapper[4992]: E0131 09:48:30.823200 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="rabbitmq" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.823232 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="rabbitmq" Jan 31 09:48:30 crc kubenswrapper[4992]: E0131 09:48:30.823258 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="setup-container" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.823268 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="setup-container" Jan 31 09:48:30 crc 
kubenswrapper[4992]: I0131 09:48:30.823612 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" containerName="rabbitmq" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.825064 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.828305 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.828561 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9l44f" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.828785 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.829115 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.829281 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.829474 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.829597 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 31 09:48:30 crc kubenswrapper[4992]: I0131 09:48:30.841180 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017558 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017613 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8d20de0-f97d-4d8a-a01f-01144400f76c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017652 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017670 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017688 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcbr\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-kube-api-access-wkcbr\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017717 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017755 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017785 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017805 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017828 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.017850 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/d8d20de0-f97d-4d8a-a01f-01144400f76c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119609 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119684 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119731 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119758 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119788 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119825 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8d20de0-f97d-4d8a-a01f-01144400f76c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119880 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119926 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8d20de0-f97d-4d8a-a01f-01144400f76c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119968 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.119994 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 
09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.120016 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcbr\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-kube-api-access-wkcbr\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.120107 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.120806 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.121047 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.121221 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.121541 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.122235 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d8d20de0-f97d-4d8a-a01f-01144400f76c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.125351 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d8d20de0-f97d-4d8a-a01f-01144400f76c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.127790 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d8d20de0-f97d-4d8a-a01f-01144400f76c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.128554 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.135362 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.144601 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcbr\" (UniqueName: \"kubernetes.io/projected/d8d20de0-f97d-4d8a-a01f-01144400f76c-kube-api-access-wkcbr\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.157852 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d8d20de0-f97d-4d8a-a01f-01144400f76c\") " pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.165345 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.193078 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b7a97b-2d62-4b05-84f6-fc720ce9c672" path="/var/lib/kubelet/pods/71b7a97b-2d62-4b05-84f6-fc720ce9c672/volumes" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.194337 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8005e2e7-ed00-4af1-be65-12638ce3a9f9" path="/var/lib/kubelet/pods/8005e2e7-ed00-4af1-be65-12638ce3a9f9/volumes" Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.465792 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"27279979-e584-4689-893b-6357ed920fef","Type":"ContainerStarted","Data":"799ae375d2d304e766d7bd09d1af561118f73e5af3864f92969ed6bab1536490"} Jan 31 09:48:31 crc kubenswrapper[4992]: I0131 09:48:31.596306 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 31 09:48:31 crc kubenswrapper[4992]: W0131 09:48:31.596303 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8d20de0_f97d_4d8a_a01f_01144400f76c.slice/crio-101516c2f14fa5d220183ee206c0b9443ce92f8ed052d9d8eef0a5b42c37a9ee WatchSource:0}: Error finding container 101516c2f14fa5d220183ee206c0b9443ce92f8ed052d9d8eef0a5b42c37a9ee: Status 404 returned error can't find the container with id 101516c2f14fa5d220183ee206c0b9443ce92f8ed052d9d8eef0a5b42c37a9ee Jan 31 09:48:32 crc kubenswrapper[4992]: I0131 09:48:32.474470 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8d20de0-f97d-4d8a-a01f-01144400f76c","Type":"ContainerStarted","Data":"101516c2f14fa5d220183ee206c0b9443ce92f8ed052d9d8eef0a5b42c37a9ee"} Jan 31 09:48:32 crc kubenswrapper[4992]: I0131 09:48:32.475919 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"27279979-e584-4689-893b-6357ed920fef","Type":"ContainerStarted","Data":"1ec5ac2613784fc640f6ec2c15a18b396f116753d24ce942221b1607e285ff27"} Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.265095 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-z5jgz"] Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.267338 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.279145 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-z5jgz"] Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.283141 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.365688 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-config\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.365726 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.365796 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: 
\"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.366155 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx92c\" (UniqueName: \"kubernetes.io/projected/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-kube-api-access-sx92c\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.366232 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-dns-svc\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.366363 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.467774 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.467905 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx92c\" (UniqueName: \"kubernetes.io/projected/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-kube-api-access-sx92c\") pod 
\"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.467935 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-dns-svc\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.467976 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.468024 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-config\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.468053 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.468906 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: 
\"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.469027 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.469172 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.469231 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-dns-svc\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.469314 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-config\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.487510 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8d20de0-f97d-4d8a-a01f-01144400f76c","Type":"ContainerStarted","Data":"f8b5e4bb8f6b2263a82bf3bb3dd4f4e21bc9314d0c2c7a52928a3f0769616cba"} Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.492030 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sx92c\" (UniqueName: \"kubernetes.io/projected/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-kube-api-access-sx92c\") pod \"dnsmasq-dns-578b8d767c-z5jgz\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:33 crc kubenswrapper[4992]: I0131 09:48:33.598139 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:34 crc kubenswrapper[4992]: I0131 09:48:34.067243 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-z5jgz"] Jan 31 09:48:34 crc kubenswrapper[4992]: W0131 09:48:34.074665 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod731ad5fb_df90_44b4_9ebc_ad2e4aa13622.slice/crio-ae32090c3e53fc5c296385fd0cb6c9a9711616400f6145daf365b3ad8a8ee862 WatchSource:0}: Error finding container ae32090c3e53fc5c296385fd0cb6c9a9711616400f6145daf365b3ad8a8ee862: Status 404 returned error can't find the container with id ae32090c3e53fc5c296385fd0cb6c9a9711616400f6145daf365b3ad8a8ee862 Jan 31 09:48:34 crc kubenswrapper[4992]: I0131 09:48:34.495621 4992 generic.go:334] "Generic (PLEG): container finished" podID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerID="da70ee3745082dacfe40b51272404955e8de3b40669eae7ffe55c51e76802af4" exitCode=0 Jan 31 09:48:34 crc kubenswrapper[4992]: I0131 09:48:34.495675 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" event={"ID":"731ad5fb-df90-44b4-9ebc-ad2e4aa13622","Type":"ContainerDied","Data":"da70ee3745082dacfe40b51272404955e8de3b40669eae7ffe55c51e76802af4"} Jan 31 09:48:34 crc kubenswrapper[4992]: I0131 09:48:34.495983 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" 
event={"ID":"731ad5fb-df90-44b4-9ebc-ad2e4aa13622","Type":"ContainerStarted","Data":"ae32090c3e53fc5c296385fd0cb6c9a9711616400f6145daf365b3ad8a8ee862"} Jan 31 09:48:35 crc kubenswrapper[4992]: I0131 09:48:35.512410 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" event={"ID":"731ad5fb-df90-44b4-9ebc-ad2e4aa13622","Type":"ContainerStarted","Data":"7c9778566e72dd10fbf5fe930be43f5d3eeb2000852b89e9ba5c946c0f7f6a6f"} Jan 31 09:48:35 crc kubenswrapper[4992]: I0131 09:48:35.512835 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:35 crc kubenswrapper[4992]: I0131 09:48:35.538524 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" podStartSLOduration=2.53850718 podStartE2EDuration="2.53850718s" podCreationTimestamp="2026-01-31 09:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:48:35.530183291 +0000 UTC m=+1411.501575278" watchObservedRunningTime="2026-01-31 09:48:35.53850718 +0000 UTC m=+1411.509899167" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.600728 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.675947 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-lnws2"] Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.677912 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerName="dnsmasq-dns" containerID="cri-o://a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495" gracePeriod=10 Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.867091 4992 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-lqx4l"] Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.886591 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.896585 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-lqx4l"] Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.977378 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.977453 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mtnx\" (UniqueName: \"kubernetes.io/projected/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-kube-api-access-5mtnx\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.977563 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.977623 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-config\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: 
\"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.977812 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:43 crc kubenswrapper[4992]: I0131 09:48:43.977943 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.079693 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.080083 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.080139 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: 
\"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.080169 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mtnx\" (UniqueName: \"kubernetes.io/projected/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-kube-api-access-5mtnx\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.080242 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.080315 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-config\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.081202 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-openstack-edpm-ipam\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.081456 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-nb\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " 
pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.081487 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-sb\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.081886 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-dns-svc\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.082134 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-config\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.103820 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mtnx\" (UniqueName: \"kubernetes.io/projected/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-kube-api-access-5mtnx\") pod \"dnsmasq-dns-fbc59fbb7-lqx4l\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.186119 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.218925 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.283340 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-dns-svc\") pod \"5e4678f5-97cd-4850-92f1-486ae4ddafda\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.283457 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-config\") pod \"5e4678f5-97cd-4850-92f1-486ae4ddafda\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.283525 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6mx7\" (UniqueName: \"kubernetes.io/projected/5e4678f5-97cd-4850-92f1-486ae4ddafda-kube-api-access-v6mx7\") pod \"5e4678f5-97cd-4850-92f1-486ae4ddafda\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.283592 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-nb\") pod \"5e4678f5-97cd-4850-92f1-486ae4ddafda\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.283623 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-sb\") pod \"5e4678f5-97cd-4850-92f1-486ae4ddafda\" (UID: \"5e4678f5-97cd-4850-92f1-486ae4ddafda\") " Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.288008 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5e4678f5-97cd-4850-92f1-486ae4ddafda-kube-api-access-v6mx7" (OuterVolumeSpecName: "kube-api-access-v6mx7") pod "5e4678f5-97cd-4850-92f1-486ae4ddafda" (UID: "5e4678f5-97cd-4850-92f1-486ae4ddafda"). InnerVolumeSpecName "kube-api-access-v6mx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.333012 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-config" (OuterVolumeSpecName: "config") pod "5e4678f5-97cd-4850-92f1-486ae4ddafda" (UID: "5e4678f5-97cd-4850-92f1-486ae4ddafda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.341302 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e4678f5-97cd-4850-92f1-486ae4ddafda" (UID: "5e4678f5-97cd-4850-92f1-486ae4ddafda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.348593 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e4678f5-97cd-4850-92f1-486ae4ddafda" (UID: "5e4678f5-97cd-4850-92f1-486ae4ddafda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.361489 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e4678f5-97cd-4850-92f1-486ae4ddafda" (UID: "5e4678f5-97cd-4850-92f1-486ae4ddafda"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.387867 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.387893 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.387902 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6mx7\" (UniqueName: \"kubernetes.io/projected/5e4678f5-97cd-4850-92f1-486ae4ddafda-kube-api-access-v6mx7\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.387914 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.387924 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4678f5-97cd-4850-92f1-486ae4ddafda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.609977 4992 generic.go:334] "Generic (PLEG): container finished" podID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerID="a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495" exitCode=0 Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.610019 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" event={"ID":"5e4678f5-97cd-4850-92f1-486ae4ddafda","Type":"ContainerDied","Data":"a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495"} Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 
09:48:44.610047 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" event={"ID":"5e4678f5-97cd-4850-92f1-486ae4ddafda","Type":"ContainerDied","Data":"68eba168a4a0ea59a3436b4d95a2884e8520680bf22e399f3db163833aa0859d"} Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.610041 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-lnws2" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.610060 4992 scope.go:117] "RemoveContainer" containerID="a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.632595 4992 scope.go:117] "RemoveContainer" containerID="bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.647394 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-lnws2"] Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.657165 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-lnws2"] Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.664315 4992 scope.go:117] "RemoveContainer" containerID="a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495" Jan 31 09:48:44 crc kubenswrapper[4992]: E0131 09:48:44.665213 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495\": container with ID starting with a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495 not found: ID does not exist" containerID="a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.665252 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495"} err="failed to get container status \"a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495\": rpc error: code = NotFound desc = could not find container \"a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495\": container with ID starting with a625a955c99c2209013d94cd55943bb26dcb71830fd84c73b560a9998d348495 not found: ID does not exist" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.665275 4992 scope.go:117] "RemoveContainer" containerID="bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347" Jan 31 09:48:44 crc kubenswrapper[4992]: E0131 09:48:44.665854 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347\": container with ID starting with bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347 not found: ID does not exist" containerID="bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347" Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.665882 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347"} err="failed to get container status \"bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347\": rpc error: code = NotFound desc = could not find container \"bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347\": container with ID starting with bc53b673f0d554e010ea529addf1e8a8504d1ee1b837dc48c1c9d2a37e332347 not found: ID does not exist" Jan 31 09:48:44 crc kubenswrapper[4992]: W0131 09:48:44.671051 4992 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dfb628f_f3c1_402c_8ff0_3c52f32003e0.slice/crio-02f753d3e16a57b754cf55b25f92c6c3fb7bfd977ef41ff630c709692cd6770a WatchSource:0}: Error finding container 02f753d3e16a57b754cf55b25f92c6c3fb7bfd977ef41ff630c709692cd6770a: Status 404 returned error can't find the container with id 02f753d3e16a57b754cf55b25f92c6c3fb7bfd977ef41ff630c709692cd6770a Jan 31 09:48:44 crc kubenswrapper[4992]: I0131 09:48:44.673526 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-lqx4l"] Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.194074 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" path="/var/lib/kubelet/pods/5e4678f5-97cd-4850-92f1-486ae4ddafda/volumes" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.301534 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.301602 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.301654 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.302411 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"85b7e8954b104f8b7761c24a9e9d822599579a66efc36412ab6a9d3f1890fe38"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.302522 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://85b7e8954b104f8b7761c24a9e9d822599579a66efc36412ab6a9d3f1890fe38" gracePeriod=600 Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.537632 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-96l8m"] Jan 31 09:48:45 crc kubenswrapper[4992]: E0131 09:48:45.537999 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerName="init" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.538015 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerName="init" Jan 31 09:48:45 crc kubenswrapper[4992]: E0131 09:48:45.538044 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerName="dnsmasq-dns" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.538050 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerName="dnsmasq-dns" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.538240 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4678f5-97cd-4850-92f1-486ae4ddafda" containerName="dnsmasq-dns" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.539376 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.552512 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96l8m"] Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.620436 4992 generic.go:334] "Generic (PLEG): container finished" podID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerID="c54c49ed0068a6cad171837fb3681f86c0ba9b69fce3443480c042df2571dcda" exitCode=0 Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.620776 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" event={"ID":"6dfb628f-f3c1-402c-8ff0-3c52f32003e0","Type":"ContainerDied","Data":"c54c49ed0068a6cad171837fb3681f86c0ba9b69fce3443480c042df2571dcda"} Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.620825 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" event={"ID":"6dfb628f-f3c1-402c-8ff0-3c52f32003e0","Type":"ContainerStarted","Data":"02f753d3e16a57b754cf55b25f92c6c3fb7bfd977ef41ff630c709692cd6770a"} Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.627699 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="85b7e8954b104f8b7761c24a9e9d822599579a66efc36412ab6a9d3f1890fe38" exitCode=0 Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.627775 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"85b7e8954b104f8b7761c24a9e9d822599579a66efc36412ab6a9d3f1890fe38"} Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.627963 4992 scope.go:117] "RemoveContainer" containerID="eefc220641844057c58f4645845ce2f51a73e101cb77d772da4c569d245be5c5" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.709459 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-catalog-content\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.709570 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrwhw\" (UniqueName: \"kubernetes.io/projected/7485752f-e231-4c6a-a028-1c4ef852b439-kube-api-access-zrwhw\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.709626 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-utilities\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.810879 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-catalog-content\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.810952 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrwhw\" (UniqueName: \"kubernetes.io/projected/7485752f-e231-4c6a-a028-1c4ef852b439-kube-api-access-zrwhw\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.810999 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-utilities\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.812299 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-utilities\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.812370 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-catalog-content\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.843318 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrwhw\" (UniqueName: \"kubernetes.io/projected/7485752f-e231-4c6a-a028-1c4ef852b439-kube-api-access-zrwhw\") pod \"redhat-operators-96l8m\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:45 crc kubenswrapper[4992]: I0131 09:48:45.891292 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.377590 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96l8m"] Jan 31 09:48:46 crc kubenswrapper[4992]: W0131 09:48:46.379647 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7485752f_e231_4c6a_a028_1c4ef852b439.slice/crio-fa76625e228daf6e3511eecdb3ae91b96f63634898c0bfdadef8ef2ae74d7b10 WatchSource:0}: Error finding container fa76625e228daf6e3511eecdb3ae91b96f63634898c0bfdadef8ef2ae74d7b10: Status 404 returned error can't find the container with id fa76625e228daf6e3511eecdb3ae91b96f63634898c0bfdadef8ef2ae74d7b10 Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.637053 4992 generic.go:334] "Generic (PLEG): container finished" podID="7485752f-e231-4c6a-a028-1c4ef852b439" containerID="5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea" exitCode=0 Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.637181 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerDied","Data":"5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea"} Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.637462 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerStarted","Data":"fa76625e228daf6e3511eecdb3ae91b96f63634898c0bfdadef8ef2ae74d7b10"} Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.638934 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.640531 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9"} Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.642904 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" event={"ID":"6dfb628f-f3c1-402c-8ff0-3c52f32003e0","Type":"ContainerStarted","Data":"5193f7dad3d75f00d9be15d7afc9cab70355890a8643bfd2353d291424d99341"} Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.643337 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:46 crc kubenswrapper[4992]: I0131 09:48:46.702105 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" podStartSLOduration=3.7020883749999998 podStartE2EDuration="3.702088375s" podCreationTimestamp="2026-01-31 09:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:48:46.698619135 +0000 UTC m=+1422.670011132" watchObservedRunningTime="2026-01-31 09:48:46.702088375 +0000 UTC m=+1422.673480362" Jan 31 09:48:47 crc kubenswrapper[4992]: I0131 09:48:47.652715 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerStarted","Data":"2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce"} Jan 31 09:48:48 crc kubenswrapper[4992]: I0131 09:48:48.663401 4992 generic.go:334] "Generic (PLEG): container finished" podID="7485752f-e231-4c6a-a028-1c4ef852b439" containerID="2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce" exitCode=0 Jan 31 09:48:48 crc kubenswrapper[4992]: I0131 09:48:48.663485 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerDied","Data":"2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce"} Jan 31 09:48:49 crc kubenswrapper[4992]: I0131 09:48:49.674029 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerStarted","Data":"abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67"} Jan 31 09:48:49 crc kubenswrapper[4992]: I0131 09:48:49.690956 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-96l8m" podStartSLOduration=2.235292811 podStartE2EDuration="4.690939304s" podCreationTimestamp="2026-01-31 09:48:45 +0000 UTC" firstStartedPulling="2026-01-31 09:48:46.638577087 +0000 UTC m=+1422.609969074" lastFinishedPulling="2026-01-31 09:48:49.09422358 +0000 UTC m=+1425.065615567" observedRunningTime="2026-01-31 09:48:49.689711729 +0000 UTC m=+1425.661103736" watchObservedRunningTime="2026-01-31 09:48:49.690939304 +0000 UTC m=+1425.662331291" Jan 31 09:48:54 crc kubenswrapper[4992]: I0131 09:48:54.220681 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 09:48:54 crc kubenswrapper[4992]: I0131 09:48:54.337470 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-z5jgz"] Jan 31 09:48:54 crc kubenswrapper[4992]: I0131 09:48:54.337767 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerName="dnsmasq-dns" containerID="cri-o://7c9778566e72dd10fbf5fe930be43f5d3eeb2000852b89e9ba5c946c0f7f6a6f" gracePeriod=10 Jan 31 09:48:54 crc kubenswrapper[4992]: I0131 09:48:54.725236 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerID="7c9778566e72dd10fbf5fe930be43f5d3eeb2000852b89e9ba5c946c0f7f6a6f" exitCode=0 Jan 31 09:48:54 crc kubenswrapper[4992]: I0131 09:48:54.725528 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" event={"ID":"731ad5fb-df90-44b4-9ebc-ad2e4aa13622","Type":"ContainerDied","Data":"7c9778566e72dd10fbf5fe930be43f5d3eeb2000852b89e9ba5c946c0f7f6a6f"} Jan 31 09:48:54 crc kubenswrapper[4992]: I0131 09:48:54.827920 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.002434 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-config\") pod \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.003574 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-nb\") pod \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.003618 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx92c\" (UniqueName: \"kubernetes.io/projected/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-kube-api-access-sx92c\") pod \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.003670 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-sb\") pod 
\"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.003728 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-dns-svc\") pod \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.003777 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-openstack-edpm-ipam\") pod \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\" (UID: \"731ad5fb-df90-44b4-9ebc-ad2e4aa13622\") " Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.008148 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-kube-api-access-sx92c" (OuterVolumeSpecName: "kube-api-access-sx92c") pod "731ad5fb-df90-44b4-9ebc-ad2e4aa13622" (UID: "731ad5fb-df90-44b4-9ebc-ad2e4aa13622"). InnerVolumeSpecName "kube-api-access-sx92c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.066035 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "731ad5fb-df90-44b4-9ebc-ad2e4aa13622" (UID: "731ad5fb-df90-44b4-9ebc-ad2e4aa13622"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.076952 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "731ad5fb-df90-44b4-9ebc-ad2e4aa13622" (UID: "731ad5fb-df90-44b4-9ebc-ad2e4aa13622"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.077993 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "731ad5fb-df90-44b4-9ebc-ad2e4aa13622" (UID: "731ad5fb-df90-44b4-9ebc-ad2e4aa13622"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.090782 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-config" (OuterVolumeSpecName: "config") pod "731ad5fb-df90-44b4-9ebc-ad2e4aa13622" (UID: "731ad5fb-df90-44b4-9ebc-ad2e4aa13622"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.093969 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "731ad5fb-df90-44b4-9ebc-ad2e4aa13622" (UID: "731ad5fb-df90-44b4-9ebc-ad2e4aa13622"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.106156 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.106184 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.106195 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.106205 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.106215 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.106225 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx92c\" (UniqueName: \"kubernetes.io/projected/731ad5fb-df90-44b4-9ebc-ad2e4aa13622-kube-api-access-sx92c\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.735964 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" event={"ID":"731ad5fb-df90-44b4-9ebc-ad2e4aa13622","Type":"ContainerDied","Data":"ae32090c3e53fc5c296385fd0cb6c9a9711616400f6145daf365b3ad8a8ee862"} Jan 31 09:48:55 crc 
kubenswrapper[4992]: I0131 09:48:55.736231 4992 scope.go:117] "RemoveContainer" containerID="7c9778566e72dd10fbf5fe930be43f5d3eeb2000852b89e9ba5c946c0f7f6a6f" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.736028 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-z5jgz" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.760916 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-z5jgz"] Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.767149 4992 scope.go:117] "RemoveContainer" containerID="da70ee3745082dacfe40b51272404955e8de3b40669eae7ffe55c51e76802af4" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.768309 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-z5jgz"] Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.892268 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.892305 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:55 crc kubenswrapper[4992]: I0131 09:48:55.935795 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:56 crc kubenswrapper[4992]: I0131 09:48:56.803285 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:56 crc kubenswrapper[4992]: I0131 09:48:56.854596 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96l8m"] Jan 31 09:48:57 crc kubenswrapper[4992]: I0131 09:48:57.193536 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" 
path="/var/lib/kubelet/pods/731ad5fb-df90-44b4-9ebc-ad2e4aa13622/volumes" Jan 31 09:48:58 crc kubenswrapper[4992]: I0131 09:48:58.777533 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-96l8m" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="registry-server" containerID="cri-o://abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67" gracePeriod=2 Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.231637 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.284153 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrwhw\" (UniqueName: \"kubernetes.io/projected/7485752f-e231-4c6a-a028-1c4ef852b439-kube-api-access-zrwhw\") pod \"7485752f-e231-4c6a-a028-1c4ef852b439\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.284229 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-catalog-content\") pod \"7485752f-e231-4c6a-a028-1c4ef852b439\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.284298 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-utilities\") pod \"7485752f-e231-4c6a-a028-1c4ef852b439\" (UID: \"7485752f-e231-4c6a-a028-1c4ef852b439\") " Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.286881 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-utilities" (OuterVolumeSpecName: "utilities") pod "7485752f-e231-4c6a-a028-1c4ef852b439" 
(UID: "7485752f-e231-4c6a-a028-1c4ef852b439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.310902 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7485752f-e231-4c6a-a028-1c4ef852b439-kube-api-access-zrwhw" (OuterVolumeSpecName: "kube-api-access-zrwhw") pod "7485752f-e231-4c6a-a028-1c4ef852b439" (UID: "7485752f-e231-4c6a-a028-1c4ef852b439"). InnerVolumeSpecName "kube-api-access-zrwhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.387161 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrwhw\" (UniqueName: \"kubernetes.io/projected/7485752f-e231-4c6a-a028-1c4ef852b439-kube-api-access-zrwhw\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.387197 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.421857 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7485752f-e231-4c6a-a028-1c4ef852b439" (UID: "7485752f-e231-4c6a-a028-1c4ef852b439"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.488925 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7485752f-e231-4c6a-a028-1c4ef852b439-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.789746 4992 generic.go:334] "Generic (PLEG): container finished" podID="7485752f-e231-4c6a-a028-1c4ef852b439" containerID="abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67" exitCode=0 Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.789792 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96l8m" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.789792 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerDied","Data":"abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67"} Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.789991 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96l8m" event={"ID":"7485752f-e231-4c6a-a028-1c4ef852b439","Type":"ContainerDied","Data":"fa76625e228daf6e3511eecdb3ae91b96f63634898c0bfdadef8ef2ae74d7b10"} Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.790048 4992 scope.go:117] "RemoveContainer" containerID="abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.821110 4992 scope.go:117] "RemoveContainer" containerID="2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.828572 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96l8m"] Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 
09:48:59.842250 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-96l8m"] Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.857292 4992 scope.go:117] "RemoveContainer" containerID="5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.886387 4992 scope.go:117] "RemoveContainer" containerID="abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67" Jan 31 09:48:59 crc kubenswrapper[4992]: E0131 09:48:59.886785 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67\": container with ID starting with abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67 not found: ID does not exist" containerID="abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.886820 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67"} err="failed to get container status \"abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67\": rpc error: code = NotFound desc = could not find container \"abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67\": container with ID starting with abedb3939d0f1beb44aff78bec09c630c4bbd354775193c99cdd13277d6c2f67 not found: ID does not exist" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.886846 4992 scope.go:117] "RemoveContainer" containerID="2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce" Jan 31 09:48:59 crc kubenswrapper[4992]: E0131 09:48:59.887220 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce\": container with ID 
starting with 2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce not found: ID does not exist" containerID="2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.887246 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce"} err="failed to get container status \"2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce\": rpc error: code = NotFound desc = could not find container \"2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce\": container with ID starting with 2398ab8932c6935eeb8c5cdc2d2cdf6cd08569720dc1d13a9bd506803717d2ce not found: ID does not exist" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.887262 4992 scope.go:117] "RemoveContainer" containerID="5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea" Jan 31 09:48:59 crc kubenswrapper[4992]: E0131 09:48:59.887633 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea\": container with ID starting with 5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea not found: ID does not exist" containerID="5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea" Jan 31 09:48:59 crc kubenswrapper[4992]: I0131 09:48:59.887667 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea"} err="failed to get container status \"5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea\": rpc error: code = NotFound desc = could not find container \"5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea\": container with ID starting with 5e27cc7b47c1debbd0d581405c078da9be6a67ec4fc38c3a97d0f77175b74cea not found: 
ID does not exist" Jan 31 09:49:01 crc kubenswrapper[4992]: I0131 09:49:01.192431 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" path="/var/lib/kubelet/pods/7485752f-e231-4c6a-a028-1c4ef852b439/volumes" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.445219 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z"] Jan 31 09:49:04 crc kubenswrapper[4992]: E0131 09:49:04.446239 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerName="dnsmasq-dns" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446260 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerName="dnsmasq-dns" Jan 31 09:49:04 crc kubenswrapper[4992]: E0131 09:49:04.446271 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerName="init" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446278 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerName="init" Jan 31 09:49:04 crc kubenswrapper[4992]: E0131 09:49:04.446292 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="extract-content" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446299 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="extract-content" Jan 31 09:49:04 crc kubenswrapper[4992]: E0131 09:49:04.446321 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="registry-server" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446328 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" 
containerName="registry-server" Jan 31 09:49:04 crc kubenswrapper[4992]: E0131 09:49:04.446354 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="extract-utilities" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446361 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="extract-utilities" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446572 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="731ad5fb-df90-44b4-9ebc-ad2e4aa13622" containerName="dnsmasq-dns" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.446585 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="7485752f-e231-4c6a-a028-1c4ef852b439" containerName="registry-server" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.447484 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.449944 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.450473 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.450564 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.452526 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.476609 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z"] Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 
09:49:04.482221 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.482281 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.482308 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk9tr\" (UniqueName: \"kubernetes.io/projected/627e3db2-cdce-499c-88b6-ec31436246c0-kube-api-access-bk9tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.482622 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.585785 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.585982 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.586024 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.586057 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk9tr\" (UniqueName: \"kubernetes.io/projected/627e3db2-cdce-499c-88b6-ec31436246c0-kube-api-access-bk9tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.593936 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: 
\"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.593984 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.595806 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.604213 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk9tr\" (UniqueName: \"kubernetes.io/projected/627e3db2-cdce-499c-88b6-ec31436246c0-kube-api-access-bk9tr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.774910 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.848803 4992 generic.go:334] "Generic (PLEG): container finished" podID="27279979-e584-4689-893b-6357ed920fef" containerID="1ec5ac2613784fc640f6ec2c15a18b396f116753d24ce942221b1607e285ff27" exitCode=0 Jan 31 09:49:04 crc kubenswrapper[4992]: I0131 09:49:04.848847 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"27279979-e584-4689-893b-6357ed920fef","Type":"ContainerDied","Data":"1ec5ac2613784fc640f6ec2c15a18b396f116753d24ce942221b1607e285ff27"} Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.306697 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z"] Jan 31 09:49:05 crc kubenswrapper[4992]: W0131 09:49:05.343658 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627e3db2_cdce_499c_88b6_ec31436246c0.slice/crio-44bbcc8bf3cab85b59111d451032772ba08558599deb3cda010d7bb9dfa8754a WatchSource:0}: Error finding container 44bbcc8bf3cab85b59111d451032772ba08558599deb3cda010d7bb9dfa8754a: Status 404 returned error can't find the container with id 44bbcc8bf3cab85b59111d451032772ba08558599deb3cda010d7bb9dfa8754a Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.858689 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" event={"ID":"627e3db2-cdce-499c-88b6-ec31436246c0","Type":"ContainerStarted","Data":"44bbcc8bf3cab85b59111d451032772ba08558599deb3cda010d7bb9dfa8754a"} Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.861327 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"27279979-e584-4689-893b-6357ed920fef","Type":"ContainerStarted","Data":"53bd25b895075681e727294c7c0436104c228937137b0f88c55b01fad78635d6"} Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.861620 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.862897 4992 generic.go:334] "Generic (PLEG): container finished" podID="d8d20de0-f97d-4d8a-a01f-01144400f76c" containerID="f8b5e4bb8f6b2263a82bf3bb3dd4f4e21bc9314d0c2c7a52928a3f0769616cba" exitCode=0 Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.862931 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8d20de0-f97d-4d8a-a01f-01144400f76c","Type":"ContainerDied","Data":"f8b5e4bb8f6b2263a82bf3bb3dd4f4e21bc9314d0c2c7a52928a3f0769616cba"} Jan 31 09:49:05 crc kubenswrapper[4992]: I0131 09:49:05.895143 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.89512324 podStartE2EDuration="36.89512324s" podCreationTimestamp="2026-01-31 09:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:49:05.885107914 +0000 UTC m=+1441.856499921" watchObservedRunningTime="2026-01-31 09:49:05.89512324 +0000 UTC m=+1441.866515247" Jan 31 09:49:06 crc kubenswrapper[4992]: I0131 09:49:06.874635 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d8d20de0-f97d-4d8a-a01f-01144400f76c","Type":"ContainerStarted","Data":"45f4ffb98f32754eabf034f5cab62dd022ebbc291d001b0cd681c620df67bb24"} Jan 31 09:49:06 crc kubenswrapper[4992]: I0131 09:49:06.875231 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:49:06 crc kubenswrapper[4992]: I0131 09:49:06.906728 4992 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.906700071 podStartE2EDuration="36.906700071s" podCreationTimestamp="2026-01-31 09:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:49:06.898497487 +0000 UTC m=+1442.869889494" watchObservedRunningTime="2026-01-31 09:49:06.906700071 +0000 UTC m=+1442.878092058" Jan 31 09:49:14 crc kubenswrapper[4992]: I0131 09:49:14.124647 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:49:14 crc kubenswrapper[4992]: I0131 09:49:14.943500 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" event={"ID":"627e3db2-cdce-499c-88b6-ec31436246c0","Type":"ContainerStarted","Data":"d81f5b770b5a441d90c574171ef6df6ba7de1a9b00d5aeb7e305ed5f73af49cb"} Jan 31 09:49:20 crc kubenswrapper[4992]: I0131 09:49:20.215727 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 31 09:49:20 crc kubenswrapper[4992]: I0131 09:49:20.257745 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" podStartSLOduration=7.48131366 podStartE2EDuration="16.257719021s" podCreationTimestamp="2026-01-31 09:49:04 +0000 UTC" firstStartedPulling="2026-01-31 09:49:05.34600516 +0000 UTC m=+1441.317397147" lastFinishedPulling="2026-01-31 09:49:14.122410521 +0000 UTC m=+1450.093802508" observedRunningTime="2026-01-31 09:49:14.972643532 +0000 UTC m=+1450.944035529" watchObservedRunningTime="2026-01-31 09:49:20.257719021 +0000 UTC m=+1456.229111018" Jan 31 09:49:21 crc kubenswrapper[4992]: I0131 09:49:21.168715 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Jan 31 09:49:28 crc kubenswrapper[4992]: I0131 09:49:28.068884 4992 generic.go:334] "Generic (PLEG): container finished" podID="627e3db2-cdce-499c-88b6-ec31436246c0" containerID="d81f5b770b5a441d90c574171ef6df6ba7de1a9b00d5aeb7e305ed5f73af49cb" exitCode=0 Jan 31 09:49:28 crc kubenswrapper[4992]: I0131 09:49:28.068982 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" event={"ID":"627e3db2-cdce-499c-88b6-ec31436246c0","Type":"ContainerDied","Data":"d81f5b770b5a441d90c574171ef6df6ba7de1a9b00d5aeb7e305ed5f73af49cb"} Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.514304 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.680769 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-repo-setup-combined-ca-bundle\") pod \"627e3db2-cdce-499c-88b6-ec31436246c0\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.681163 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-inventory\") pod \"627e3db2-cdce-499c-88b6-ec31436246c0\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.681294 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-ssh-key-openstack-edpm-ipam\") pod \"627e3db2-cdce-499c-88b6-ec31436246c0\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 
09:49:29.681513 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk9tr\" (UniqueName: \"kubernetes.io/projected/627e3db2-cdce-499c-88b6-ec31436246c0-kube-api-access-bk9tr\") pod \"627e3db2-cdce-499c-88b6-ec31436246c0\" (UID: \"627e3db2-cdce-499c-88b6-ec31436246c0\") " Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.693744 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627e3db2-cdce-499c-88b6-ec31436246c0-kube-api-access-bk9tr" (OuterVolumeSpecName: "kube-api-access-bk9tr") pod "627e3db2-cdce-499c-88b6-ec31436246c0" (UID: "627e3db2-cdce-499c-88b6-ec31436246c0"). InnerVolumeSpecName "kube-api-access-bk9tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.693749 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "627e3db2-cdce-499c-88b6-ec31436246c0" (UID: "627e3db2-cdce-499c-88b6-ec31436246c0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.710318 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-inventory" (OuterVolumeSpecName: "inventory") pod "627e3db2-cdce-499c-88b6-ec31436246c0" (UID: "627e3db2-cdce-499c-88b6-ec31436246c0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.720928 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "627e3db2-cdce-499c-88b6-ec31436246c0" (UID: "627e3db2-cdce-499c-88b6-ec31436246c0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.783341 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.783369 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk9tr\" (UniqueName: \"kubernetes.io/projected/627e3db2-cdce-499c-88b6-ec31436246c0-kube-api-access-bk9tr\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.783379 4992 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:29 crc kubenswrapper[4992]: I0131 09:49:29.783390 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/627e3db2-cdce-499c-88b6-ec31436246c0-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.144620 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" event={"ID":"627e3db2-cdce-499c-88b6-ec31436246c0","Type":"ContainerDied","Data":"44bbcc8bf3cab85b59111d451032772ba08558599deb3cda010d7bb9dfa8754a"} Jan 31 
09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.144678 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44bbcc8bf3cab85b59111d451032772ba08558599deb3cda010d7bb9dfa8754a" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.144713 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.174329 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv"] Jan 31 09:49:30 crc kubenswrapper[4992]: E0131 09:49:30.175301 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627e3db2-cdce-499c-88b6-ec31436246c0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.175402 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="627e3db2-cdce-499c-88b6-ec31436246c0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.175907 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="627e3db2-cdce-499c-88b6-ec31436246c0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.176740 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.180552 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.180872 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.181129 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.182304 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.191499 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv"] Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.295023 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx9zw\" (UniqueName: \"kubernetes.io/projected/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-kube-api-access-kx9zw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.295078 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.295131 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.295185 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.397351 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx9zw\" (UniqueName: \"kubernetes.io/projected/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-kube-api-access-kx9zw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.397404 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.397460 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.397498 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.401214 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.402292 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.417598 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx9zw\" (UniqueName: \"kubernetes.io/projected/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-kube-api-access-kx9zw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.419626 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:30 crc kubenswrapper[4992]: I0131 09:49:30.505900 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:49:31 crc kubenswrapper[4992]: I0131 09:49:30.999705 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv"] Jan 31 09:49:31 crc kubenswrapper[4992]: W0131 09:49:31.005471 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e3d4682_1214_4cda_a6d4_07bd6fe3b816.slice/crio-93057734ded6e23300e39cde98ce930c35c22c624135ea67bc522b8195de6fc1 WatchSource:0}: Error finding container 93057734ded6e23300e39cde98ce930c35c22c624135ea67bc522b8195de6fc1: Status 404 returned error can't find the container with id 93057734ded6e23300e39cde98ce930c35c22c624135ea67bc522b8195de6fc1 Jan 31 09:49:31 crc kubenswrapper[4992]: I0131 09:49:31.155231 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" event={"ID":"1e3d4682-1214-4cda-a6d4-07bd6fe3b816","Type":"ContainerStarted","Data":"93057734ded6e23300e39cde98ce930c35c22c624135ea67bc522b8195de6fc1"} Jan 31 09:49:33 crc kubenswrapper[4992]: I0131 09:49:33.196613 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" 
event={"ID":"1e3d4682-1214-4cda-a6d4-07bd6fe3b816","Type":"ContainerStarted","Data":"cad65664830e188a35e69070e0221ac64d8d22028229b989a10089f47b3d9b86"} Jan 31 09:49:33 crc kubenswrapper[4992]: I0131 09:49:33.207539 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" podStartSLOduration=1.617106742 podStartE2EDuration="3.207523915s" podCreationTimestamp="2026-01-31 09:49:30 +0000 UTC" firstStartedPulling="2026-01-31 09:49:31.007929072 +0000 UTC m=+1466.979321069" lastFinishedPulling="2026-01-31 09:49:32.598346245 +0000 UTC m=+1468.569738242" observedRunningTime="2026-01-31 09:49:33.20386138 +0000 UTC m=+1469.175253387" watchObservedRunningTime="2026-01-31 09:49:33.207523915 +0000 UTC m=+1469.178915902" Jan 31 09:50:15 crc kubenswrapper[4992]: I0131 09:50:15.543303 4992 scope.go:117] "RemoveContainer" containerID="513b36f53645297f742259b45d5286c8090d81f9e11fe4bfb5dfb00eadf6eec0" Jan 31 09:50:15 crc kubenswrapper[4992]: I0131 09:50:15.752944 4992 scope.go:117] "RemoveContainer" containerID="752ccfd32048876456e648088b75041d67eb462b98eb9b039c68910f9ac3eab7" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.396301 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-grqfh"] Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.398898 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.406398 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-grqfh"] Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.494823 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-utilities\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.495133 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfnhn\" (UniqueName: \"kubernetes.io/projected/5dd02b27-2769-4767-90e6-4176ddb33d2f-kube-api-access-vfnhn\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.495211 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-catalog-content\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.596616 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-catalog-content\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.596740 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-utilities\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.596770 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfnhn\" (UniqueName: \"kubernetes.io/projected/5dd02b27-2769-4767-90e6-4176ddb33d2f-kube-api-access-vfnhn\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.597475 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-catalog-content\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.597686 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-utilities\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.615256 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfnhn\" (UniqueName: \"kubernetes.io/projected/5dd02b27-2769-4767-90e6-4176ddb33d2f-kube-api-access-vfnhn\") pod \"redhat-marketplace-grqfh\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:36 crc kubenswrapper[4992]: I0131 09:50:36.719041 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:37 crc kubenswrapper[4992]: I0131 09:50:37.176641 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-grqfh"] Jan 31 09:50:37 crc kubenswrapper[4992]: I0131 09:50:37.831511 4992 generic.go:334] "Generic (PLEG): container finished" podID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerID="bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70" exitCode=0 Jan 31 09:50:37 crc kubenswrapper[4992]: I0131 09:50:37.831652 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grqfh" event={"ID":"5dd02b27-2769-4767-90e6-4176ddb33d2f","Type":"ContainerDied","Data":"bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70"} Jan 31 09:50:37 crc kubenswrapper[4992]: I0131 09:50:37.831883 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grqfh" event={"ID":"5dd02b27-2769-4767-90e6-4176ddb33d2f","Type":"ContainerStarted","Data":"9fccbf929978b53da6851aa432fc90a58ae2cb8ac8e76dccc98c4cacaf46f16c"} Jan 31 09:50:38 crc kubenswrapper[4992]: E0131 09:50:38.770346 4992 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd02b27_2769_4767_90e6_4176ddb33d2f.slice/crio-conmon-46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:50:38 crc kubenswrapper[4992]: I0131 09:50:38.843495 4992 generic.go:334] "Generic (PLEG): container finished" podID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerID="46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9" exitCode=0 Jan 31 09:50:38 crc kubenswrapper[4992]: I0131 09:50:38.843584 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grqfh" 
event={"ID":"5dd02b27-2769-4767-90e6-4176ddb33d2f","Type":"ContainerDied","Data":"46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9"} Jan 31 09:50:39 crc kubenswrapper[4992]: I0131 09:50:39.855806 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grqfh" event={"ID":"5dd02b27-2769-4767-90e6-4176ddb33d2f","Type":"ContainerStarted","Data":"2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd"} Jan 31 09:50:39 crc kubenswrapper[4992]: I0131 09:50:39.883355 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-grqfh" podStartSLOduration=2.458747367 podStartE2EDuration="3.883340023s" podCreationTimestamp="2026-01-31 09:50:36 +0000 UTC" firstStartedPulling="2026-01-31 09:50:37.833580709 +0000 UTC m=+1533.804972706" lastFinishedPulling="2026-01-31 09:50:39.258173375 +0000 UTC m=+1535.229565362" observedRunningTime="2026-01-31 09:50:39.878755562 +0000 UTC m=+1535.850147559" watchObservedRunningTime="2026-01-31 09:50:39.883340023 +0000 UTC m=+1535.854732010" Jan 31 09:50:42 crc kubenswrapper[4992]: I0131 09:50:42.783261 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2jztn"] Jan 31 09:50:42 crc kubenswrapper[4992]: I0131 09:50:42.785226 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:42 crc kubenswrapper[4992]: I0131 09:50:42.806236 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jztn"] Jan 31 09:50:42 crc kubenswrapper[4992]: I0131 09:50:42.921390 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-utilities\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:42 crc kubenswrapper[4992]: I0131 09:50:42.921508 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqs2r\" (UniqueName: \"kubernetes.io/projected/5230c68c-426a-4c35-acec-8abd602a63b2-kube-api-access-jqs2r\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:42 crc kubenswrapper[4992]: I0131 09:50:42.921570 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-catalog-content\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.022773 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-catalog-content\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.022882 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-utilities\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.022935 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqs2r\" (UniqueName: \"kubernetes.io/projected/5230c68c-426a-4c35-acec-8abd602a63b2-kube-api-access-jqs2r\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.023472 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-catalog-content\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.023528 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-utilities\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.057374 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqs2r\" (UniqueName: \"kubernetes.io/projected/5230c68c-426a-4c35-acec-8abd602a63b2-kube-api-access-jqs2r\") pod \"certified-operators-2jztn\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.107081 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.621905 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jztn"] Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.893333 4992 generic.go:334] "Generic (PLEG): container finished" podID="5230c68c-426a-4c35-acec-8abd602a63b2" containerID="6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9" exitCode=0 Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.893457 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerDied","Data":"6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9"} Jan 31 09:50:43 crc kubenswrapper[4992]: I0131 09:50:43.893694 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerStarted","Data":"1e792021ca94aea8ae55a92f33846acd6f207a527eb892c0682a8bc4c1802ed0"} Jan 31 09:50:44 crc kubenswrapper[4992]: I0131 09:50:44.904561 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerStarted","Data":"522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440"} Jan 31 09:50:45 crc kubenswrapper[4992]: I0131 09:50:45.301553 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:50:45 crc kubenswrapper[4992]: I0131 09:50:45.301648 4992 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:50:45 crc kubenswrapper[4992]: I0131 09:50:45.914496 4992 generic.go:334] "Generic (PLEG): container finished" podID="5230c68c-426a-4c35-acec-8abd602a63b2" containerID="522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440" exitCode=0 Jan 31 09:50:45 crc kubenswrapper[4992]: I0131 09:50:45.914549 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerDied","Data":"522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440"} Jan 31 09:50:46 crc kubenswrapper[4992]: I0131 09:50:46.719322 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:46 crc kubenswrapper[4992]: I0131 09:50:46.719463 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:46 crc kubenswrapper[4992]: I0131 09:50:46.784695 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:46 crc kubenswrapper[4992]: I0131 09:50:46.922755 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerStarted","Data":"5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965"} Jan 31 09:50:46 crc kubenswrapper[4992]: I0131 09:50:46.948311 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2jztn" podStartSLOduration=2.150005476 podStartE2EDuration="4.948291959s" 
podCreationTimestamp="2026-01-31 09:50:42 +0000 UTC" firstStartedPulling="2026-01-31 09:50:43.89483563 +0000 UTC m=+1539.866227617" lastFinishedPulling="2026-01-31 09:50:46.693122103 +0000 UTC m=+1542.664514100" observedRunningTime="2026-01-31 09:50:46.938179019 +0000 UTC m=+1542.909571006" watchObservedRunningTime="2026-01-31 09:50:46.948291959 +0000 UTC m=+1542.919683946" Jan 31 09:50:46 crc kubenswrapper[4992]: I0131 09:50:46.973909 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:49 crc kubenswrapper[4992]: I0131 09:50:49.176535 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-grqfh"] Jan 31 09:50:49 crc kubenswrapper[4992]: I0131 09:50:49.946383 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-grqfh" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="registry-server" containerID="cri-o://2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd" gracePeriod=2 Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.415138 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.459847 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-utilities\") pod \"5dd02b27-2769-4767-90e6-4176ddb33d2f\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.459906 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-catalog-content\") pod \"5dd02b27-2769-4767-90e6-4176ddb33d2f\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.460240 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfnhn\" (UniqueName: \"kubernetes.io/projected/5dd02b27-2769-4767-90e6-4176ddb33d2f-kube-api-access-vfnhn\") pod \"5dd02b27-2769-4767-90e6-4176ddb33d2f\" (UID: \"5dd02b27-2769-4767-90e6-4176ddb33d2f\") " Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.460740 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-utilities" (OuterVolumeSpecName: "utilities") pod "5dd02b27-2769-4767-90e6-4176ddb33d2f" (UID: "5dd02b27-2769-4767-90e6-4176ddb33d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.465992 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd02b27-2769-4767-90e6-4176ddb33d2f-kube-api-access-vfnhn" (OuterVolumeSpecName: "kube-api-access-vfnhn") pod "5dd02b27-2769-4767-90e6-4176ddb33d2f" (UID: "5dd02b27-2769-4767-90e6-4176ddb33d2f"). InnerVolumeSpecName "kube-api-access-vfnhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.480056 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dd02b27-2769-4767-90e6-4176ddb33d2f" (UID: "5dd02b27-2769-4767-90e6-4176ddb33d2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.563074 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.563104 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd02b27-2769-4767-90e6-4176ddb33d2f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.563115 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfnhn\" (UniqueName: \"kubernetes.io/projected/5dd02b27-2769-4767-90e6-4176ddb33d2f-kube-api-access-vfnhn\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.956519 4992 generic.go:334] "Generic (PLEG): container finished" podID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerID="2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd" exitCode=0 Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.956563 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grqfh" event={"ID":"5dd02b27-2769-4767-90e6-4176ddb33d2f","Type":"ContainerDied","Data":"2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd"} Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.956583 4992 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-grqfh" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.956610 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-grqfh" event={"ID":"5dd02b27-2769-4767-90e6-4176ddb33d2f","Type":"ContainerDied","Data":"9fccbf929978b53da6851aa432fc90a58ae2cb8ac8e76dccc98c4cacaf46f16c"} Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.956632 4992 scope.go:117] "RemoveContainer" containerID="2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd" Jan 31 09:50:50 crc kubenswrapper[4992]: I0131 09:50:50.991210 4992 scope.go:117] "RemoveContainer" containerID="46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.000062 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-grqfh"] Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.010307 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-grqfh"] Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.033510 4992 scope.go:117] "RemoveContainer" containerID="bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.056374 4992 scope.go:117] "RemoveContainer" containerID="2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd" Jan 31 09:50:51 crc kubenswrapper[4992]: E0131 09:50:51.056843 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd\": container with ID starting with 2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd not found: ID does not exist" containerID="2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.056892 4992 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd"} err="failed to get container status \"2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd\": rpc error: code = NotFound desc = could not find container \"2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd\": container with ID starting with 2d119c00476c273d3ec0f89491deb737e3b8ce3fe1ce56bea0da4a1f804a55fd not found: ID does not exist" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.056922 4992 scope.go:117] "RemoveContainer" containerID="46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9" Jan 31 09:50:51 crc kubenswrapper[4992]: E0131 09:50:51.057339 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9\": container with ID starting with 46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9 not found: ID does not exist" containerID="46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.057392 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9"} err="failed to get container status \"46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9\": rpc error: code = NotFound desc = could not find container \"46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9\": container with ID starting with 46ebf887c4b501c1728f251a9ea1728c40981f733a284f26f52b5f51351e3df9 not found: ID does not exist" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.057466 4992 scope.go:117] "RemoveContainer" containerID="bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70" Jan 31 09:50:51 crc kubenswrapper[4992]: E0131 
09:50:51.057825 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70\": container with ID starting with bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70 not found: ID does not exist" containerID="bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.057862 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70"} err="failed to get container status \"bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70\": rpc error: code = NotFound desc = could not find container \"bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70\": container with ID starting with bd5e91281d4ae9b28d902ad8cd2380bc2688e727bc2431fbd70c35748a4d2d70 not found: ID does not exist" Jan 31 09:50:51 crc kubenswrapper[4992]: I0131 09:50:51.193721 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" path="/var/lib/kubelet/pods/5dd02b27-2769-4767-90e6-4176ddb33d2f/volumes" Jan 31 09:50:53 crc kubenswrapper[4992]: I0131 09:50:53.107589 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:53 crc kubenswrapper[4992]: I0131 09:50:53.108772 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:53 crc kubenswrapper[4992]: I0131 09:50:53.159106 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:54 crc kubenswrapper[4992]: I0131 09:50:54.059491 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:55 crc kubenswrapper[4992]: I0131 09:50:55.191290 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2jztn"] Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.026670 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2jztn" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="registry-server" containerID="cri-o://5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965" gracePeriod=2 Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.444735 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.507181 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-utilities\") pod \"5230c68c-426a-4c35-acec-8abd602a63b2\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.507406 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-catalog-content\") pod \"5230c68c-426a-4c35-acec-8abd602a63b2\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.507530 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqs2r\" (UniqueName: \"kubernetes.io/projected/5230c68c-426a-4c35-acec-8abd602a63b2-kube-api-access-jqs2r\") pod \"5230c68c-426a-4c35-acec-8abd602a63b2\" (UID: \"5230c68c-426a-4c35-acec-8abd602a63b2\") " Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.508071 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-utilities" (OuterVolumeSpecName: "utilities") pod "5230c68c-426a-4c35-acec-8abd602a63b2" (UID: "5230c68c-426a-4c35-acec-8abd602a63b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.516553 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5230c68c-426a-4c35-acec-8abd602a63b2-kube-api-access-jqs2r" (OuterVolumeSpecName: "kube-api-access-jqs2r") pod "5230c68c-426a-4c35-acec-8abd602a63b2" (UID: "5230c68c-426a-4c35-acec-8abd602a63b2"). InnerVolumeSpecName "kube-api-access-jqs2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.556287 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5230c68c-426a-4c35-acec-8abd602a63b2" (UID: "5230c68c-426a-4c35-acec-8abd602a63b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.609497 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqs2r\" (UniqueName: \"kubernetes.io/projected/5230c68c-426a-4c35-acec-8abd602a63b2-kube-api-access-jqs2r\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.609553 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:57 crc kubenswrapper[4992]: I0131 09:50:57.609573 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5230c68c-426a-4c35-acec-8abd602a63b2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.042532 4992 generic.go:334] "Generic (PLEG): container finished" podID="5230c68c-426a-4c35-acec-8abd602a63b2" containerID="5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965" exitCode=0 Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.042577 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerDied","Data":"5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965"} Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.042602 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jztn" event={"ID":"5230c68c-426a-4c35-acec-8abd602a63b2","Type":"ContainerDied","Data":"1e792021ca94aea8ae55a92f33846acd6f207a527eb892c0682a8bc4c1802ed0"} Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.042624 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jztn" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.042625 4992 scope.go:117] "RemoveContainer" containerID="5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.094654 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2jztn"] Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.098027 4992 scope.go:117] "RemoveContainer" containerID="522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.104130 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2jztn"] Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.119798 4992 scope.go:117] "RemoveContainer" containerID="6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.164874 4992 scope.go:117] "RemoveContainer" containerID="5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965" Jan 31 09:50:58 crc kubenswrapper[4992]: E0131 09:50:58.165555 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965\": container with ID starting with 5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965 not found: ID does not exist" containerID="5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.165607 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965"} err="failed to get container status \"5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965\": rpc error: code = NotFound desc = could not find 
container \"5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965\": container with ID starting with 5a8c24a47151e19460d9d8e82526694f857cc63296463206c356786e45141965 not found: ID does not exist" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.165645 4992 scope.go:117] "RemoveContainer" containerID="522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440" Jan 31 09:50:58 crc kubenswrapper[4992]: E0131 09:50:58.166064 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440\": container with ID starting with 522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440 not found: ID does not exist" containerID="522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.166115 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440"} err="failed to get container status \"522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440\": rpc error: code = NotFound desc = could not find container \"522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440\": container with ID starting with 522f7687d5ec0021051114b29b7557819471c9db56c18e87a975f34c6d64c440 not found: ID does not exist" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.166145 4992 scope.go:117] "RemoveContainer" containerID="6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9" Jan 31 09:50:58 crc kubenswrapper[4992]: E0131 09:50:58.166615 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9\": container with ID starting with 6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9 not found: ID does 
not exist" containerID="6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9" Jan 31 09:50:58 crc kubenswrapper[4992]: I0131 09:50:58.166648 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9"} err="failed to get container status \"6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9\": rpc error: code = NotFound desc = could not find container \"6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9\": container with ID starting with 6effd0b42d6a13f0bdbd8e5e66dff2fbe2dd5961255bcd6ddc7bcca40e231fc9 not found: ID does not exist" Jan 31 09:50:59 crc kubenswrapper[4992]: I0131 09:50:59.191646 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" path="/var/lib/kubelet/pods/5230c68c-426a-4c35-acec-8abd602a63b2/volumes" Jan 31 09:51:15 crc kubenswrapper[4992]: I0131 09:51:15.301277 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:51:15 crc kubenswrapper[4992]: I0131 09:51:15.301912 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:51:15 crc kubenswrapper[4992]: I0131 09:51:15.832258 4992 scope.go:117] "RemoveContainer" containerID="2a8594eb86aee7b4010ece44360e831016a772f6c8b4c3784e6e296719b93825" Jan 31 09:51:15 crc kubenswrapper[4992]: I0131 09:51:15.864785 4992 scope.go:117] "RemoveContainer" 
containerID="064ddcb62dd845716362c318734fec5d6c84d2b568a60bdf0ea5cb9f3c22b2f4" Jan 31 09:51:15 crc kubenswrapper[4992]: I0131 09:51:15.914156 4992 scope.go:117] "RemoveContainer" containerID="c1bf1f06d94c1a57ef50cf1a6f46f96d135d9ed79f0f3b54ba6260b166312c71" Jan 31 09:51:15 crc kubenswrapper[4992]: I0131 09:51:15.957100 4992 scope.go:117] "RemoveContainer" containerID="83a051c833ae7061f437c780e99f141c06381328067fbd1215387053df7ec005" Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.301786 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.302387 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.302473 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.303350 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.303470 4992 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" gracePeriod=600 Jan 31 09:51:45 crc kubenswrapper[4992]: E0131 09:51:45.425994 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.490514 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" exitCode=0 Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.490566 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9"} Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.490598 4992 scope.go:117] "RemoveContainer" containerID="85b7e8954b104f8b7761c24a9e9d822599579a66efc36412ab6a9d3f1890fe38" Jan 31 09:51:45 crc kubenswrapper[4992]: I0131 09:51:45.491572 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:51:45 crc kubenswrapper[4992]: E0131 09:51:45.492058 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:51:56 crc kubenswrapper[4992]: I0131 09:51:56.182859 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:51:56 crc kubenswrapper[4992]: E0131 09:51:56.183701 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:52:11 crc kubenswrapper[4992]: I0131 09:52:11.182837 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:52:11 crc kubenswrapper[4992]: E0131 09:52:11.183496 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:52:16 crc kubenswrapper[4992]: I0131 09:52:16.046684 4992 scope.go:117] "RemoveContainer" containerID="d5505953e505e2daa4904c4a87c2192e7dada42b708ee67c12c721cb3675d953" Jan 31 09:52:26 crc kubenswrapper[4992]: I0131 09:52:26.182945 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:52:26 crc kubenswrapper[4992]: E0131 09:52:26.183765 4992 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:52:38 crc kubenswrapper[4992]: I0131 09:52:38.182767 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:52:38 crc kubenswrapper[4992]: E0131 09:52:38.183470 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:52:46 crc kubenswrapper[4992]: I0131 09:52:46.041074 4992 generic.go:334] "Generic (PLEG): container finished" podID="1e3d4682-1214-4cda-a6d4-07bd6fe3b816" containerID="cad65664830e188a35e69070e0221ac64d8d22028229b989a10089f47b3d9b86" exitCode=0 Jan 31 09:52:46 crc kubenswrapper[4992]: I0131 09:52:46.041178 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" event={"ID":"1e3d4682-1214-4cda-a6d4-07bd6fe3b816","Type":"ContainerDied","Data":"cad65664830e188a35e69070e0221ac64d8d22028229b989a10089f47b3d9b86"} Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.526523 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.619271 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-ssh-key-openstack-edpm-ipam\") pod \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.619357 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-inventory\") pod \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.619404 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx9zw\" (UniqueName: \"kubernetes.io/projected/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-kube-api-access-kx9zw\") pod \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.619451 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-bootstrap-combined-ca-bundle\") pod \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\" (UID: \"1e3d4682-1214-4cda-a6d4-07bd6fe3b816\") " Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.624322 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1e3d4682-1214-4cda-a6d4-07bd6fe3b816" (UID: "1e3d4682-1214-4cda-a6d4-07bd6fe3b816"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.628718 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-kube-api-access-kx9zw" (OuterVolumeSpecName: "kube-api-access-kx9zw") pod "1e3d4682-1214-4cda-a6d4-07bd6fe3b816" (UID: "1e3d4682-1214-4cda-a6d4-07bd6fe3b816"). InnerVolumeSpecName "kube-api-access-kx9zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.642975 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-inventory" (OuterVolumeSpecName: "inventory") pod "1e3d4682-1214-4cda-a6d4-07bd6fe3b816" (UID: "1e3d4682-1214-4cda-a6d4-07bd6fe3b816"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.644594 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1e3d4682-1214-4cda-a6d4-07bd6fe3b816" (UID: "1e3d4682-1214-4cda-a6d4-07bd6fe3b816"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.721800 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.721849 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.721859 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx9zw\" (UniqueName: \"kubernetes.io/projected/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-kube-api-access-kx9zw\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:47 crc kubenswrapper[4992]: I0131 09:52:47.721868 4992 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e3d4682-1214-4cda-a6d4-07bd6fe3b816-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.060322 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" event={"ID":"1e3d4682-1214-4cda-a6d4-07bd6fe3b816","Type":"ContainerDied","Data":"93057734ded6e23300e39cde98ce930c35c22c624135ea67bc522b8195de6fc1"} Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.060383 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93057734ded6e23300e39cde98ce930c35c22c624135ea67bc522b8195de6fc1" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.060413 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.169710 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8"] Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 09:52:48.170156 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3d4682-1214-4cda-a6d4-07bd6fe3b816" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170175 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3d4682-1214-4cda-a6d4-07bd6fe3b816" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 09:52:48.170201 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="extract-utilities" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170209 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="extract-utilities" Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 09:52:48.170225 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="extract-utilities" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170233 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="extract-utilities" Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 09:52:48.170245 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="registry-server" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170253 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="registry-server" Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 
09:52:48.170268 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="extract-content" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170275 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="extract-content" Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 09:52:48.170295 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="registry-server" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170303 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="registry-server" Jan 31 09:52:48 crc kubenswrapper[4992]: E0131 09:52:48.170320 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="extract-content" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.170329 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="extract-content" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.171705 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3d4682-1214-4cda-a6d4-07bd6fe3b816" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.171733 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5230c68c-426a-4c35-acec-8abd602a63b2" containerName="registry-server" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.171747 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd02b27-2769-4767-90e6-4176ddb33d2f" containerName="registry-server" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.172468 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.178912 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.182141 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.188228 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.208847 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.229578 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8"] Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.336623 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.337907 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc 
kubenswrapper[4992]: I0131 09:52:48.338019 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5pnq\" (UniqueName: \"kubernetes.io/projected/595068b2-328c-46b5-b5b1-da4d34af14b2-kube-api-access-t5pnq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.439331 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.439439 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5pnq\" (UniqueName: \"kubernetes.io/projected/595068b2-328c-46b5-b5b1-da4d34af14b2-kube-api-access-t5pnq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.439546 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.448246 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.450057 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.474161 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5pnq\" (UniqueName: \"kubernetes.io/projected/595068b2-328c-46b5-b5b1-da4d34af14b2-kube-api-access-t5pnq\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-69gz8\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.512236 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:52:48 crc kubenswrapper[4992]: I0131 09:52:48.886876 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8"] Jan 31 09:52:49 crc kubenswrapper[4992]: I0131 09:52:49.097297 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" event={"ID":"595068b2-328c-46b5-b5b1-da4d34af14b2","Type":"ContainerStarted","Data":"160dcd43c88926e0baa20a26bbf9b79fcfd74e03bd903961432f2810c8251165"} Jan 31 09:52:52 crc kubenswrapper[4992]: I0131 09:52:52.141147 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" event={"ID":"595068b2-328c-46b5-b5b1-da4d34af14b2","Type":"ContainerStarted","Data":"4008cb617e1336397c426ef5b438104a645b45b525edeb75c7e58f2345165c77"} Jan 31 09:52:52 crc kubenswrapper[4992]: I0131 09:52:52.166530 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" podStartSLOduration=2.017226213 podStartE2EDuration="4.166503046s" podCreationTimestamp="2026-01-31 09:52:48 +0000 UTC" firstStartedPulling="2026-01-31 09:52:48.901437999 +0000 UTC m=+1664.872829986" lastFinishedPulling="2026-01-31 09:52:51.050714802 +0000 UTC m=+1667.022106819" observedRunningTime="2026-01-31 09:52:52.156937821 +0000 UTC m=+1668.128329908" watchObservedRunningTime="2026-01-31 09:52:52.166503046 +0000 UTC m=+1668.137895033" Jan 31 09:52:52 crc kubenswrapper[4992]: I0131 09:52:52.183386 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:52:52 crc kubenswrapper[4992]: E0131 09:52:52.183660 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:53:05 crc kubenswrapper[4992]: I0131 09:53:05.198827 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:53:05 crc kubenswrapper[4992]: E0131 09:53:05.200277 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:53:16 crc kubenswrapper[4992]: I0131 09:53:16.111267 4992 scope.go:117] "RemoveContainer" containerID="5bf46410ead28306d3de3f4a39baf16883fa3361b347eaa300a57686e337170f" Jan 31 09:53:20 crc kubenswrapper[4992]: I0131 09:53:20.183090 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:53:20 crc kubenswrapper[4992]: E0131 09:53:20.184144 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.045405 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-9b8a-account-create-update-vk6sb"] Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.058804 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-n2gjj"] Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.065898 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-b42h5"] Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.073650 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-n2gjj"] Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.081112 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9b8a-account-create-update-vk6sb"] Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.088531 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-b42h5"] Jan 31 09:53:34 crc kubenswrapper[4992]: I0131 09:53:34.182616 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:53:34 crc kubenswrapper[4992]: E0131 09:53:34.182934 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.030658 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7d73-account-create-update-tl9tj"] Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.039312 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mq26l"] Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.049530 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-c240-account-create-update-jfkt9"] Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.056894 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7d73-account-create-update-tl9tj"] Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.064996 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c240-account-create-update-jfkt9"] Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.078928 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mq26l"] Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.195075 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c89abf-868c-4405-8b61-c714b4f0a2fc" path="/var/lib/kubelet/pods/09c89abf-868c-4405-8b61-c714b4f0a2fc/volumes" Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.195678 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ceda8d2-7468-42f8-beb8-bf2ac95bea0b" path="/var/lib/kubelet/pods/2ceda8d2-7468-42f8-beb8-bf2ac95bea0b/volumes" Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.196161 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579e533f-ff8a-477d-b7ca-99835fec403c" path="/var/lib/kubelet/pods/579e533f-ff8a-477d-b7ca-99835fec403c/volumes" Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.196738 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5afd2ddf-cdf8-4a71-862f-b22cceae2852" path="/var/lib/kubelet/pods/5afd2ddf-cdf8-4a71-862f-b22cceae2852/volumes" Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.197674 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f670d9f-8208-4b3d-b7f8-902b28d63375" path="/var/lib/kubelet/pods/6f670d9f-8208-4b3d-b7f8-902b28d63375/volumes" Jan 31 09:53:35 crc kubenswrapper[4992]: I0131 09:53:35.198610 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7f89a5da-e099-4e2f-a95b-fdc648424a96" path="/var/lib/kubelet/pods/7f89a5da-e099-4e2f-a95b-fdc648424a96/volumes" Jan 31 09:53:49 crc kubenswrapper[4992]: I0131 09:53:49.183018 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:53:49 crc kubenswrapper[4992]: E0131 09:53:49.183909 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:53:50 crc kubenswrapper[4992]: I0131 09:53:50.029072 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mhq92"] Jan 31 09:53:50 crc kubenswrapper[4992]: I0131 09:53:50.036626 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mhq92"] Jan 31 09:53:51 crc kubenswrapper[4992]: I0131 09:53:51.194094 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74fed1c-c0a3-4120-9724-50d5000661bb" path="/var/lib/kubelet/pods/a74fed1c-c0a3-4120-9724-50d5000661bb/volumes" Jan 31 09:54:00 crc kubenswrapper[4992]: I0131 09:54:00.054207 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-96xss"] Jan 31 09:54:00 crc kubenswrapper[4992]: I0131 09:54:00.065263 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-96xss"] Jan 31 09:54:00 crc kubenswrapper[4992]: I0131 09:54:00.798346 4992 generic.go:334] "Generic (PLEG): container finished" podID="595068b2-328c-46b5-b5b1-da4d34af14b2" containerID="4008cb617e1336397c426ef5b438104a645b45b525edeb75c7e58f2345165c77" exitCode=0 Jan 31 09:54:00 crc kubenswrapper[4992]: I0131 
09:54:00.798396 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" event={"ID":"595068b2-328c-46b5-b5b1-da4d34af14b2","Type":"ContainerDied","Data":"4008cb617e1336397c426ef5b438104a645b45b525edeb75c7e58f2345165c77"} Jan 31 09:54:01 crc kubenswrapper[4992]: I0131 09:54:01.183356 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:54:01 crc kubenswrapper[4992]: E0131 09:54:01.183644 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:54:01 crc kubenswrapper[4992]: I0131 09:54:01.202882 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5e7d3d-d54b-4b87-8578-34d1764e7e0d" path="/var/lib/kubelet/pods/fc5e7d3d-d54b-4b87-8578-34d1764e7e0d/volumes" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.269486 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.388155 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5pnq\" (UniqueName: \"kubernetes.io/projected/595068b2-328c-46b5-b5b1-da4d34af14b2-kube-api-access-t5pnq\") pod \"595068b2-328c-46b5-b5b1-da4d34af14b2\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.388520 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-ssh-key-openstack-edpm-ipam\") pod \"595068b2-328c-46b5-b5b1-da4d34af14b2\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.388739 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-inventory\") pod \"595068b2-328c-46b5-b5b1-da4d34af14b2\" (UID: \"595068b2-328c-46b5-b5b1-da4d34af14b2\") " Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.395604 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595068b2-328c-46b5-b5b1-da4d34af14b2-kube-api-access-t5pnq" (OuterVolumeSpecName: "kube-api-access-t5pnq") pod "595068b2-328c-46b5-b5b1-da4d34af14b2" (UID: "595068b2-328c-46b5-b5b1-da4d34af14b2"). InnerVolumeSpecName "kube-api-access-t5pnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.412663 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-inventory" (OuterVolumeSpecName: "inventory") pod "595068b2-328c-46b5-b5b1-da4d34af14b2" (UID: "595068b2-328c-46b5-b5b1-da4d34af14b2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.417163 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "595068b2-328c-46b5-b5b1-da4d34af14b2" (UID: "595068b2-328c-46b5-b5b1-da4d34af14b2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.490467 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.490495 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5pnq\" (UniqueName: \"kubernetes.io/projected/595068b2-328c-46b5-b5b1-da4d34af14b2-kube-api-access-t5pnq\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.490507 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595068b2-328c-46b5-b5b1-da4d34af14b2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.823381 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" event={"ID":"595068b2-328c-46b5-b5b1-da4d34af14b2","Type":"ContainerDied","Data":"160dcd43c88926e0baa20a26bbf9b79fcfd74e03bd903961432f2810c8251165"} Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.823455 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160dcd43c88926e0baa20a26bbf9b79fcfd74e03bd903961432f2810c8251165" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 
09:54:02.823474 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.925478 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8"] Jan 31 09:54:02 crc kubenswrapper[4992]: E0131 09:54:02.925973 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="595068b2-328c-46b5-b5b1-da4d34af14b2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.925996 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="595068b2-328c-46b5-b5b1-da4d34af14b2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.926261 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="595068b2-328c-46b5-b5b1-da4d34af14b2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.927007 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.930689 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.931704 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.932162 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.932559 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:54:02 crc kubenswrapper[4992]: I0131 09:54:02.934625 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8"] Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:02.999172 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:02.999244 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: 
I0131 09:54:02.999306 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7hc\" (UniqueName: \"kubernetes.io/projected/d26e4222-9259-4d18-a67c-a3890117d486-kube-api-access-sv7hc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.100687 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7hc\" (UniqueName: \"kubernetes.io/projected/d26e4222-9259-4d18-a67c-a3890117d486-kube-api-access-sv7hc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.100864 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.100906 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.108254 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.111034 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.119741 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7hc\" (UniqueName: \"kubernetes.io/projected/d26e4222-9259-4d18-a67c-a3890117d486-kube-api-access-sv7hc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-79nt8\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.247494 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.790478 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8"] Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.796243 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:54:03 crc kubenswrapper[4992]: I0131 09:54:03.836015 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" event={"ID":"d26e4222-9259-4d18-a67c-a3890117d486","Type":"ContainerStarted","Data":"e6efde0af5bb18cfe09a500db309d1cb9f5ffb671463085224ac7b5f8adb3358"} Jan 31 09:54:05 crc kubenswrapper[4992]: I0131 09:54:05.861268 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" event={"ID":"d26e4222-9259-4d18-a67c-a3890117d486","Type":"ContainerStarted","Data":"5b476fc156520a50e29c812a048028fef35bc25e96080ff83975e220f1aca646"} Jan 31 09:54:09 crc kubenswrapper[4992]: I0131 09:54:09.899208 4992 generic.go:334] "Generic (PLEG): container finished" podID="d26e4222-9259-4d18-a67c-a3890117d486" containerID="5b476fc156520a50e29c812a048028fef35bc25e96080ff83975e220f1aca646" exitCode=0 Jan 31 09:54:09 crc kubenswrapper[4992]: I0131 09:54:09.899356 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" event={"ID":"d26e4222-9259-4d18-a67c-a3890117d486","Type":"ContainerDied","Data":"5b476fc156520a50e29c812a048028fef35bc25e96080ff83975e220f1aca646"} Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.364111 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.555487 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-ssh-key-openstack-edpm-ipam\") pod \"d26e4222-9259-4d18-a67c-a3890117d486\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.555654 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv7hc\" (UniqueName: \"kubernetes.io/projected/d26e4222-9259-4d18-a67c-a3890117d486-kube-api-access-sv7hc\") pod \"d26e4222-9259-4d18-a67c-a3890117d486\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.555717 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-inventory\") pod \"d26e4222-9259-4d18-a67c-a3890117d486\" (UID: \"d26e4222-9259-4d18-a67c-a3890117d486\") " Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.560743 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26e4222-9259-4d18-a67c-a3890117d486-kube-api-access-sv7hc" (OuterVolumeSpecName: "kube-api-access-sv7hc") pod "d26e4222-9259-4d18-a67c-a3890117d486" (UID: "d26e4222-9259-4d18-a67c-a3890117d486"). InnerVolumeSpecName "kube-api-access-sv7hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.582501 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d26e4222-9259-4d18-a67c-a3890117d486" (UID: "d26e4222-9259-4d18-a67c-a3890117d486"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.582920 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-inventory" (OuterVolumeSpecName: "inventory") pod "d26e4222-9259-4d18-a67c-a3890117d486" (UID: "d26e4222-9259-4d18-a67c-a3890117d486"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.659440 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv7hc\" (UniqueName: \"kubernetes.io/projected/d26e4222-9259-4d18-a67c-a3890117d486-kube-api-access-sv7hc\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.659500 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.659525 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26e4222-9259-4d18-a67c-a3890117d486-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.935462 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" 
event={"ID":"d26e4222-9259-4d18-a67c-a3890117d486","Type":"ContainerDied","Data":"e6efde0af5bb18cfe09a500db309d1cb9f5ffb671463085224ac7b5f8adb3358"} Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.935530 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6efde0af5bb18cfe09a500db309d1cb9f5ffb671463085224ac7b5f8adb3358" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.935774 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.994683 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr"] Jan 31 09:54:11 crc kubenswrapper[4992]: E0131 09:54:11.996271 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26e4222-9259-4d18-a67c-a3890117d486" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.996464 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26e4222-9259-4d18-a67c-a3890117d486" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.996797 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26e4222-9259-4d18-a67c-a3890117d486" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:11 crc kubenswrapper[4992]: I0131 09:54:11.997558 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.000719 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.001023 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.001745 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.001986 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.007333 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.060270 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sb5bv"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.066059 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.066136 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: 
\"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.066177 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnw8c\" (UniqueName: \"kubernetes.io/projected/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-kube-api-access-fnw8c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.067238 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tfxcr"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.079220 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-q7bsh"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.092499 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2fa6-account-create-update-cjr6n"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.102871 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tfxcr"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.110892 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sb5bv"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.118053 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-q7bsh"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.125260 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2fa6-account-create-update-cjr6n"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.167765 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.168072 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.168218 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnw8c\" (UniqueName: \"kubernetes.io/projected/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-kube-api-access-fnw8c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.172711 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.174845 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.183070 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:54:12 crc kubenswrapper[4992]: E0131 09:54:12.183389 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.192307 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnw8c\" (UniqueName: \"kubernetes.io/projected/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-kube-api-access-fnw8c\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhqlr\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.322870 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:12 crc kubenswrapper[4992]: W0131 09:54:12.840784 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361ac32c_fc8e_4a26_ac92_c64b5bba4ffd.slice/crio-ed16a6f31c495349ad6244537ba2f16b45e49b58c6df92c324f6eac485a98a88 WatchSource:0}: Error finding container ed16a6f31c495349ad6244537ba2f16b45e49b58c6df92c324f6eac485a98a88: Status 404 returned error can't find the container with id ed16a6f31c495349ad6244537ba2f16b45e49b58c6df92c324f6eac485a98a88 Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.843495 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr"] Jan 31 09:54:12 crc kubenswrapper[4992]: I0131 09:54:12.947518 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" event={"ID":"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd","Type":"ContainerStarted","Data":"ed16a6f31c495349ad6244537ba2f16b45e49b58c6df92c324f6eac485a98a88"} Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.037314 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fa36-account-create-update-cczt6"] Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.050777 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c015-account-create-update-lm7x4"] Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.062114 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fa36-account-create-update-cczt6"] Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.073024 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c015-account-create-update-lm7x4"] Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.192552 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="35185358-7b72-425e-af0b-c52b7887ce93" path="/var/lib/kubelet/pods/35185358-7b72-425e-af0b-c52b7887ce93/volumes" Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.193632 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4c3a5c-ab08-4b94-877e-73e2641429d4" path="/var/lib/kubelet/pods/3d4c3a5c-ab08-4b94-877e-73e2641429d4/volumes" Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.194306 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690c5d16-8767-4215-adcc-6c52a3f214f9" path="/var/lib/kubelet/pods/690c5d16-8767-4215-adcc-6c52a3f214f9/volumes" Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.194897 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab96b3c5-39bc-40ae-a1eb-2a751e90c944" path="/var/lib/kubelet/pods/ab96b3c5-39bc-40ae-a1eb-2a751e90c944/volumes" Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.195928 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23756fe-5e0d-43f4-a977-a9058b096998" path="/var/lib/kubelet/pods/d23756fe-5e0d-43f4-a977-a9058b096998/volumes" Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.196476 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80b2ba9-24f2-44ec-a523-74a843ee40dd" path="/var/lib/kubelet/pods/f80b2ba9-24f2-44ec-a523-74a843ee40dd/volumes" Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.957014 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" event={"ID":"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd","Type":"ContainerStarted","Data":"514a43dad636cc56e6082511525f06b2252abcac2c2788ab6cfb237aeb178c58"} Jan 31 09:54:13 crc kubenswrapper[4992]: I0131 09:54:13.994272 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" podStartSLOduration=2.595913527 podStartE2EDuration="2.994245038s" 
podCreationTimestamp="2026-01-31 09:54:11 +0000 UTC" firstStartedPulling="2026-01-31 09:54:12.844150417 +0000 UTC m=+1748.815542404" lastFinishedPulling="2026-01-31 09:54:13.242481918 +0000 UTC m=+1749.213873915" observedRunningTime="2026-01-31 09:54:13.981388019 +0000 UTC m=+1749.952780006" watchObservedRunningTime="2026-01-31 09:54:13.994245038 +0000 UTC m=+1749.965637055" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.161060 4992 scope.go:117] "RemoveContainer" containerID="6c68440cf9fcfce76bd1d78c1272c4a1440d6b4a940a03cf907014d0d8b28d84" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.185987 4992 scope.go:117] "RemoveContainer" containerID="20cfb4142b2181223dbe1e2a0e38750978856209645f96870a80021175c07ee8" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.230650 4992 scope.go:117] "RemoveContainer" containerID="a125c6652d5a45fddfe04455bb030d8b1b580ef1b6ddf6353284f470e3f68563" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.274946 4992 scope.go:117] "RemoveContainer" containerID="cc7fd2a0dd88b452290fbb5cc0e910cd2b7723f244931e175efb443ec0d57191" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.311880 4992 scope.go:117] "RemoveContainer" containerID="f53b8af4592fabf7cc60bff83aca365372f9799cdb7b03014b435fc1c76e961f" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.347757 4992 scope.go:117] "RemoveContainer" containerID="4c762f2076813a7d523ae8961273df420ec2c0ed0d7ef616a69860a461c91864" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.384559 4992 scope.go:117] "RemoveContainer" containerID="da1bb5046b900cafc0140c7654537e65d84bfe3cae67fa00a8f81a7cf68e79bd" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.401158 4992 scope.go:117] "RemoveContainer" containerID="b915d530c1d6e23ea3b77b88f1cb76e04cb9a79cb0b988cb1a621ddb4e8d6304" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.421341 4992 scope.go:117] "RemoveContainer" containerID="c8dbd09a79b362ae518f1af8233235c2d81e2132f2ca037d570628b1a9b467d4" Jan 31 
09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.456782 4992 scope.go:117] "RemoveContainer" containerID="d65b48f8cf341fdd90cf9a1a76c3229b7c44a83699a17d5c00c62fecf4da588e" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.479779 4992 scope.go:117] "RemoveContainer" containerID="b6b8f0fb362912d1a816259fb0b3d0d5f42c08100b7fbb93c4a53af70ba6456b" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.499221 4992 scope.go:117] "RemoveContainer" containerID="4e49450a3e88467ef58c8a1ade1b4326bab7416669730f3e9ec9b4c1db4447e9" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.520512 4992 scope.go:117] "RemoveContainer" containerID="bec59689753821e3d838bb5be9f9d7584201e77be80bd347e712c396937ba831" Jan 31 09:54:16 crc kubenswrapper[4992]: I0131 09:54:16.540238 4992 scope.go:117] "RemoveContainer" containerID="e571af91bd1193e5938ccaa793ae49bc150646e1a06122e8f7850f386fb01fb9" Jan 31 09:54:23 crc kubenswrapper[4992]: I0131 09:54:23.045118 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-lw589"] Jan 31 09:54:23 crc kubenswrapper[4992]: I0131 09:54:23.057206 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-lw589"] Jan 31 09:54:23 crc kubenswrapper[4992]: I0131 09:54:23.192190 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac29a1f-926d-44c3-b380-4f48340ad9ce" path="/var/lib/kubelet/pods/9ac29a1f-926d-44c3-b380-4f48340ad9ce/volumes" Jan 31 09:54:25 crc kubenswrapper[4992]: I0131 09:54:25.187692 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:54:25 crc kubenswrapper[4992]: E0131 09:54:25.188203 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:54:40 crc kubenswrapper[4992]: I0131 09:54:40.183402 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:54:40 crc kubenswrapper[4992]: E0131 09:54:40.184384 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:54:47 crc kubenswrapper[4992]: I0131 09:54:47.271669 4992 generic.go:334] "Generic (PLEG): container finished" podID="361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" containerID="514a43dad636cc56e6082511525f06b2252abcac2c2788ab6cfb237aeb178c58" exitCode=0 Jan 31 09:54:47 crc kubenswrapper[4992]: I0131 09:54:47.271747 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" event={"ID":"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd","Type":"ContainerDied","Data":"514a43dad636cc56e6082511525f06b2252abcac2c2788ab6cfb237aeb178c58"} Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.651449 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.814250 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-ssh-key-openstack-edpm-ipam\") pod \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.814464 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-inventory\") pod \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.814762 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnw8c\" (UniqueName: \"kubernetes.io/projected/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-kube-api-access-fnw8c\") pod \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\" (UID: \"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd\") " Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.825608 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-kube-api-access-fnw8c" (OuterVolumeSpecName: "kube-api-access-fnw8c") pod "361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" (UID: "361ac32c-fc8e-4a26-ac92-c64b5bba4ffd"). InnerVolumeSpecName "kube-api-access-fnw8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.841471 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" (UID: "361ac32c-fc8e-4a26-ac92-c64b5bba4ffd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.844500 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-inventory" (OuterVolumeSpecName: "inventory") pod "361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" (UID: "361ac32c-fc8e-4a26-ac92-c64b5bba4ffd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.918239 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.918282 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:48 crc kubenswrapper[4992]: I0131 09:54:48.918292 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnw8c\" (UniqueName: \"kubernetes.io/projected/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd-kube-api-access-fnw8c\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.290584 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" 
event={"ID":"361ac32c-fc8e-4a26-ac92-c64b5bba4ffd","Type":"ContainerDied","Data":"ed16a6f31c495349ad6244537ba2f16b45e49b58c6df92c324f6eac485a98a88"} Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.290642 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed16a6f31c495349ad6244537ba2f16b45e49b58c6df92c324f6eac485a98a88" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.290663 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.388878 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk"] Jan 31 09:54:49 crc kubenswrapper[4992]: E0131 09:54:49.389231 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.389249 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.389495 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.390092 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.392049 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.392290 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.392440 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.393493 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.397939 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk"] Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.531128 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.531191 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.531342 
4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm5s6\" (UniqueName: \"kubernetes.io/projected/684b95fc-cf60-4200-84d5-e7024abd3534-kube-api-access-tm5s6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.632745 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm5s6\" (UniqueName: \"kubernetes.io/projected/684b95fc-cf60-4200-84d5-e7024abd3534-kube-api-access-tm5s6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.632854 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.632882 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.637839 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-inventory\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.647400 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.651065 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm5s6\" (UniqueName: \"kubernetes.io/projected/684b95fc-cf60-4200-84d5-e7024abd3534-kube-api-access-tm5s6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:49 crc kubenswrapper[4992]: I0131 09:54:49.706069 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:50 crc kubenswrapper[4992]: I0131 09:54:50.229791 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk"] Jan 31 09:54:50 crc kubenswrapper[4992]: I0131 09:54:50.300959 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" event={"ID":"684b95fc-cf60-4200-84d5-e7024abd3534","Type":"ContainerStarted","Data":"24c927a0a7446197be0cbd69d9164525e829e2615f209f8014133ad6b5d31256"} Jan 31 09:54:51 crc kubenswrapper[4992]: I0131 09:54:51.308726 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" event={"ID":"684b95fc-cf60-4200-84d5-e7024abd3534","Type":"ContainerStarted","Data":"c9b93c36e46afdf9050b50c139c7f1ca101ea6375add174e8e8a2e8ab272a583"} Jan 31 09:54:51 crc kubenswrapper[4992]: I0131 09:54:51.331707 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" podStartSLOduration=1.929254732 podStartE2EDuration="2.33169087s" podCreationTimestamp="2026-01-31 09:54:49 +0000 UTC" firstStartedPulling="2026-01-31 09:54:50.240295728 +0000 UTC m=+1786.211687715" lastFinishedPulling="2026-01-31 09:54:50.642731846 +0000 UTC m=+1786.614123853" observedRunningTime="2026-01-31 09:54:51.329141167 +0000 UTC m=+1787.300533154" watchObservedRunningTime="2026-01-31 09:54:51.33169087 +0000 UTC m=+1787.303082857" Jan 31 09:54:52 crc kubenswrapper[4992]: I0131 09:54:52.182197 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:54:52 crc kubenswrapper[4992]: E0131 09:54:52.182804 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:54:53 crc kubenswrapper[4992]: I0131 09:54:53.038985 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4mh4k"] Jan 31 09:54:53 crc kubenswrapper[4992]: I0131 09:54:53.047037 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4mh4k"] Jan 31 09:54:53 crc kubenswrapper[4992]: I0131 09:54:53.190893 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4a75ee-0abf-46e9-ac05-14641a2fd782" path="/var/lib/kubelet/pods/cf4a75ee-0abf-46e9-ac05-14641a2fd782/volumes" Jan 31 09:54:55 crc kubenswrapper[4992]: I0131 09:54:55.361180 4992 generic.go:334] "Generic (PLEG): container finished" podID="684b95fc-cf60-4200-84d5-e7024abd3534" containerID="c9b93c36e46afdf9050b50c139c7f1ca101ea6375add174e8e8a2e8ab272a583" exitCode=0 Jan 31 09:54:55 crc kubenswrapper[4992]: I0131 09:54:55.361285 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" event={"ID":"684b95fc-cf60-4200-84d5-e7024abd3534","Type":"ContainerDied","Data":"c9b93c36e46afdf9050b50c139c7f1ca101ea6375add174e8e8a2e8ab272a583"} Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.772702 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.868060 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm5s6\" (UniqueName: \"kubernetes.io/projected/684b95fc-cf60-4200-84d5-e7024abd3534-kube-api-access-tm5s6\") pod \"684b95fc-cf60-4200-84d5-e7024abd3534\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.868156 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-ssh-key-openstack-edpm-ipam\") pod \"684b95fc-cf60-4200-84d5-e7024abd3534\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.868355 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-inventory\") pod \"684b95fc-cf60-4200-84d5-e7024abd3534\" (UID: \"684b95fc-cf60-4200-84d5-e7024abd3534\") " Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.873859 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684b95fc-cf60-4200-84d5-e7024abd3534-kube-api-access-tm5s6" (OuterVolumeSpecName: "kube-api-access-tm5s6") pod "684b95fc-cf60-4200-84d5-e7024abd3534" (UID: "684b95fc-cf60-4200-84d5-e7024abd3534"). InnerVolumeSpecName "kube-api-access-tm5s6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.894129 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "684b95fc-cf60-4200-84d5-e7024abd3534" (UID: "684b95fc-cf60-4200-84d5-e7024abd3534"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.896166 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-inventory" (OuterVolumeSpecName: "inventory") pod "684b95fc-cf60-4200-84d5-e7024abd3534" (UID: "684b95fc-cf60-4200-84d5-e7024abd3534"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.970275 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm5s6\" (UniqueName: \"kubernetes.io/projected/684b95fc-cf60-4200-84d5-e7024abd3534-kube-api-access-tm5s6\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.970302 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:56.970315 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684b95fc-cf60-4200-84d5-e7024abd3534-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.386577 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" 
event={"ID":"684b95fc-cf60-4200-84d5-e7024abd3534","Type":"ContainerDied","Data":"24c927a0a7446197be0cbd69d9164525e829e2615f209f8014133ad6b5d31256"} Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.386615 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c927a0a7446197be0cbd69d9164525e829e2615f209f8014133ad6b5d31256" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.386678 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.467755 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd"] Jan 31 09:54:57 crc kubenswrapper[4992]: E0131 09:54:57.468488 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684b95fc-cf60-4200-84d5-e7024abd3534" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.468512 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="684b95fc-cf60-4200-84d5-e7024abd3534" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.468726 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="684b95fc-cf60-4200-84d5-e7024abd3534" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.469383 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.471829 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.472070 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.472273 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.472650 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.478709 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd"] Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.583589 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72c46\" (UniqueName: \"kubernetes.io/projected/e671e97e-21df-468c-8142-c4da5165814c-kube-api-access-72c46\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.583749 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc 
kubenswrapper[4992]: I0131 09:54:57.583864 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.685937 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.686037 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.686069 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72c46\" (UniqueName: \"kubernetes.io/projected/e671e97e-21df-468c-8142-c4da5165814c-kube-api-access-72c46\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.690029 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.690029 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.716598 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72c46\" (UniqueName: \"kubernetes.io/projected/e671e97e-21df-468c-8142-c4da5165814c-kube-api-access-72c46\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:57 crc kubenswrapper[4992]: I0131 09:54:57.786152 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:54:58 crc kubenswrapper[4992]: I0131 09:54:58.341257 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd"] Jan 31 09:54:58 crc kubenswrapper[4992]: I0131 09:54:58.401818 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" event={"ID":"e671e97e-21df-468c-8142-c4da5165814c","Type":"ContainerStarted","Data":"b27f6756cc3424305b5a33b8cbaedb24c40c44eb3f1ea8072e1bb0f81bff4b1f"} Jan 31 09:54:59 crc kubenswrapper[4992]: I0131 09:54:59.429275 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" event={"ID":"e671e97e-21df-468c-8142-c4da5165814c","Type":"ContainerStarted","Data":"cf3d1de6c588d3f54134231504b6eac26a013144b733bc5f1bee9ac6d84c4a80"} Jan 31 09:54:59 crc kubenswrapper[4992]: I0131 09:54:59.450040 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" podStartSLOduration=2.018899218 podStartE2EDuration="2.450024221s" podCreationTimestamp="2026-01-31 09:54:57 +0000 UTC" firstStartedPulling="2026-01-31 09:54:58.35279391 +0000 UTC m=+1794.324185897" lastFinishedPulling="2026-01-31 09:54:58.783918913 +0000 UTC m=+1794.755310900" observedRunningTime="2026-01-31 09:54:59.443136513 +0000 UTC m=+1795.414528510" watchObservedRunningTime="2026-01-31 09:54:59.450024221 +0000 UTC m=+1795.421416208" Jan 31 09:55:04 crc kubenswrapper[4992]: I0131 09:55:04.183579 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:55:04 crc kubenswrapper[4992]: E0131 09:55:04.184141 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:55:05 crc kubenswrapper[4992]: I0131 09:55:05.044934 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mbgzl"] Jan 31 09:55:05 crc kubenswrapper[4992]: I0131 09:55:05.054832 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-52w86"] Jan 31 09:55:05 crc kubenswrapper[4992]: I0131 09:55:05.063772 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mbgzl"] Jan 31 09:55:05 crc kubenswrapper[4992]: I0131 09:55:05.071484 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-52w86"] Jan 31 09:55:05 crc kubenswrapper[4992]: I0131 09:55:05.193658 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1be806-c02e-4606-94ea-438caf8ef9c6" path="/var/lib/kubelet/pods/cf1be806-c02e-4606-94ea-438caf8ef9c6/volumes" Jan 31 09:55:05 crc kubenswrapper[4992]: I0131 09:55:05.194379 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1e7454-fec7-4ec7-a2e2-5e4ebb145213" path="/var/lib/kubelet/pods/db1e7454-fec7-4ec7-a2e2-5e4ebb145213/volumes" Jan 31 09:55:13 crc kubenswrapper[4992]: I0131 09:55:13.028260 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lbgq7"] Jan 31 09:55:13 crc kubenswrapper[4992]: I0131 09:55:13.038257 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lbgq7"] Jan 31 09:55:13 crc kubenswrapper[4992]: I0131 09:55:13.195850 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02f6c85a-822e-4864-b0aa-1c487d73721c" 
path="/var/lib/kubelet/pods/02f6c85a-822e-4864-b0aa-1c487d73721c/volumes" Jan 31 09:55:14 crc kubenswrapper[4992]: I0131 09:55:14.027942 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9dxlh"] Jan 31 09:55:14 crc kubenswrapper[4992]: I0131 09:55:14.038613 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9dxlh"] Jan 31 09:55:15 crc kubenswrapper[4992]: I0131 09:55:15.195308 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f531e7-e05e-4537-bb22-01911330abd2" path="/var/lib/kubelet/pods/57f531e7-e05e-4537-bb22-01911330abd2/volumes" Jan 31 09:55:16 crc kubenswrapper[4992]: I0131 09:55:16.759808 4992 scope.go:117] "RemoveContainer" containerID="a5becdc9ce36f7d8808f5ce6cf9e9e9a2a315551ba934d4e45a418cedea58924" Jan 31 09:55:16 crc kubenswrapper[4992]: I0131 09:55:16.801024 4992 scope.go:117] "RemoveContainer" containerID="69e839170a296d37a2791c1c4b3a5531525a1d55c14766d4f8f8785969eeef4f" Jan 31 09:55:16 crc kubenswrapper[4992]: I0131 09:55:16.850548 4992 scope.go:117] "RemoveContainer" containerID="2d5b2cb07e909d54987e1d03967060cd8eb2bb0c5bd742d43d89cf64e66516c8" Jan 31 09:55:16 crc kubenswrapper[4992]: I0131 09:55:16.892360 4992 scope.go:117] "RemoveContainer" containerID="5085c4e9275d593ec2f9e83b58a2cdfefb630195561b76133d1f4174685ae5a0" Jan 31 09:55:16 crc kubenswrapper[4992]: I0131 09:55:16.954314 4992 scope.go:117] "RemoveContainer" containerID="9e44dbfa9195a0240df096afe4c92d21c3e0bcab9078bdb9ca7f5640b797ef34" Jan 31 09:55:17 crc kubenswrapper[4992]: I0131 09:55:17.026200 4992 scope.go:117] "RemoveContainer" containerID="36c8234fe1065dca43d3dc49b6de769bc057e035a2564c2a249ff2eb5b982c7c" Jan 31 09:55:19 crc kubenswrapper[4992]: I0131 09:55:19.182542 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:55:19 crc kubenswrapper[4992]: E0131 09:55:19.183082 4992 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:55:31 crc kubenswrapper[4992]: I0131 09:55:31.184386 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:55:31 crc kubenswrapper[4992]: E0131 09:55:31.185333 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:55:42 crc kubenswrapper[4992]: I0131 09:55:42.850633 4992 generic.go:334] "Generic (PLEG): container finished" podID="e671e97e-21df-468c-8142-c4da5165814c" containerID="cf3d1de6c588d3f54134231504b6eac26a013144b733bc5f1bee9ac6d84c4a80" exitCode=0 Jan 31 09:55:42 crc kubenswrapper[4992]: I0131 09:55:42.850746 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" event={"ID":"e671e97e-21df-468c-8142-c4da5165814c","Type":"ContainerDied","Data":"cf3d1de6c588d3f54134231504b6eac26a013144b733bc5f1bee9ac6d84c4a80"} Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.182918 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:55:44 crc kubenswrapper[4992]: E0131 09:55:44.183855 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.274005 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.346644 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72c46\" (UniqueName: \"kubernetes.io/projected/e671e97e-21df-468c-8142-c4da5165814c-kube-api-access-72c46\") pod \"e671e97e-21df-468c-8142-c4da5165814c\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.346822 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-inventory\") pod \"e671e97e-21df-468c-8142-c4da5165814c\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.346941 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-ssh-key-openstack-edpm-ipam\") pod \"e671e97e-21df-468c-8142-c4da5165814c\" (UID: \"e671e97e-21df-468c-8142-c4da5165814c\") " Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.354118 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e671e97e-21df-468c-8142-c4da5165814c-kube-api-access-72c46" (OuterVolumeSpecName: "kube-api-access-72c46") pod "e671e97e-21df-468c-8142-c4da5165814c" (UID: "e671e97e-21df-468c-8142-c4da5165814c"). 
InnerVolumeSpecName "kube-api-access-72c46". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.377126 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-inventory" (OuterVolumeSpecName: "inventory") pod "e671e97e-21df-468c-8142-c4da5165814c" (UID: "e671e97e-21df-468c-8142-c4da5165814c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.385148 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e671e97e-21df-468c-8142-c4da5165814c" (UID: "e671e97e-21df-468c-8142-c4da5165814c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.450568 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72c46\" (UniqueName: \"kubernetes.io/projected/e671e97e-21df-468c-8142-c4da5165814c-kube-api-access-72c46\") on node \"crc\" DevicePath \"\"" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.450617 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.450639 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e671e97e-21df-468c-8142-c4da5165814c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.871443 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" event={"ID":"e671e97e-21df-468c-8142-c4da5165814c","Type":"ContainerDied","Data":"b27f6756cc3424305b5a33b8cbaedb24c40c44eb3f1ea8072e1bb0f81bff4b1f"} Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.871741 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27f6756cc3424305b5a33b8cbaedb24c40c44eb3f1ea8072e1bb0f81bff4b1f" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.871499 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.963395 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fhwbp"] Jan 31 09:55:44 crc kubenswrapper[4992]: E0131 09:55:44.964023 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e671e97e-21df-468c-8142-c4da5165814c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.964057 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="e671e97e-21df-468c-8142-c4da5165814c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.964450 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="e671e97e-21df-468c-8142-c4da5165814c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.965414 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.968135 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.970896 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.971190 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.971504 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:55:44 crc kubenswrapper[4992]: I0131 09:55:44.976782 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fhwbp"] Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.063007 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.063324 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.063525 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cwxnm\" (UniqueName: \"kubernetes.io/projected/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-kube-api-access-cwxnm\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.165529 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.165675 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.165788 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwxnm\" (UniqueName: \"kubernetes.io/projected/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-kube-api-access-cwxnm\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.171580 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.182258 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.187218 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwxnm\" (UniqueName: \"kubernetes.io/projected/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-kube-api-access-cwxnm\") pod \"ssh-known-hosts-edpm-deployment-fhwbp\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.294123 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.859066 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fhwbp"] Jan 31 09:55:45 crc kubenswrapper[4992]: I0131 09:55:45.880114 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" event={"ID":"7b59e5b4-c357-4fb5-8606-6a7fef1519cf","Type":"ContainerStarted","Data":"30324bca244eabc98870f520ff7164f10d85967902e7792e765ba9545f8406d3"} Jan 31 09:55:46 crc kubenswrapper[4992]: I0131 09:55:46.891133 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" event={"ID":"7b59e5b4-c357-4fb5-8606-6a7fef1519cf","Type":"ContainerStarted","Data":"ca35c71fd7ee98003c37eabae76a9ad3ba72586edc1c63d97114b17124ffd278"} Jan 31 09:55:46 crc kubenswrapper[4992]: I0131 09:55:46.922541 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" 
podStartSLOduration=2.4670872360000002 podStartE2EDuration="2.922510537s" podCreationTimestamp="2026-01-31 09:55:44 +0000 UTC" firstStartedPulling="2026-01-31 09:55:45.863599618 +0000 UTC m=+1841.834991625" lastFinishedPulling="2026-01-31 09:55:46.319022939 +0000 UTC m=+1842.290414926" observedRunningTime="2026-01-31 09:55:46.910572084 +0000 UTC m=+1842.881964101" watchObservedRunningTime="2026-01-31 09:55:46.922510537 +0000 UTC m=+1842.893902534" Jan 31 09:55:52 crc kubenswrapper[4992]: I0131 09:55:52.950437 4992 generic.go:334] "Generic (PLEG): container finished" podID="7b59e5b4-c357-4fb5-8606-6a7fef1519cf" containerID="ca35c71fd7ee98003c37eabae76a9ad3ba72586edc1c63d97114b17124ffd278" exitCode=0 Jan 31 09:55:52 crc kubenswrapper[4992]: I0131 09:55:52.950494 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" event={"ID":"7b59e5b4-c357-4fb5-8606-6a7fef1519cf","Type":"ContainerDied","Data":"ca35c71fd7ee98003c37eabae76a9ad3ba72586edc1c63d97114b17124ffd278"} Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.385783 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.437986 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-ssh-key-openstack-edpm-ipam\") pod \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.438339 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwxnm\" (UniqueName: \"kubernetes.io/projected/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-kube-api-access-cwxnm\") pod \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.438395 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-inventory-0\") pod \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\" (UID: \"7b59e5b4-c357-4fb5-8606-6a7fef1519cf\") " Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.454630 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-kube-api-access-cwxnm" (OuterVolumeSpecName: "kube-api-access-cwxnm") pod "7b59e5b4-c357-4fb5-8606-6a7fef1519cf" (UID: "7b59e5b4-c357-4fb5-8606-6a7fef1519cf"). InnerVolumeSpecName "kube-api-access-cwxnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.524915 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7b59e5b4-c357-4fb5-8606-6a7fef1519cf" (UID: "7b59e5b4-c357-4fb5-8606-6a7fef1519cf"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.541732 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwxnm\" (UniqueName: \"kubernetes.io/projected/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-kube-api-access-cwxnm\") on node \"crc\" DevicePath \"\"" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.541771 4992 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.566462 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b59e5b4-c357-4fb5-8606-6a7fef1519cf" (UID: "7b59e5b4-c357-4fb5-8606-6a7fef1519cf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.643844 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b59e5b4-c357-4fb5-8606-6a7fef1519cf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.971916 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" event={"ID":"7b59e5b4-c357-4fb5-8606-6a7fef1519cf","Type":"ContainerDied","Data":"30324bca244eabc98870f520ff7164f10d85967902e7792e765ba9545f8406d3"} Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.971954 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30324bca244eabc98870f520ff7164f10d85967902e7792e765ba9545f8406d3" Jan 31 09:55:54 crc kubenswrapper[4992]: I0131 09:55:54.972011 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fhwbp" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.065927 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl"] Jan 31 09:55:55 crc kubenswrapper[4992]: E0131 09:55:55.066382 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b59e5b4-c357-4fb5-8606-6a7fef1519cf" containerName="ssh-known-hosts-edpm-deployment" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.066432 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b59e5b4-c357-4fb5-8606-6a7fef1519cf" containerName="ssh-known-hosts-edpm-deployment" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.066634 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b59e5b4-c357-4fb5-8606-6a7fef1519cf" containerName="ssh-known-hosts-edpm-deployment" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.067287 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.072478 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.072666 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.072757 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.074308 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.087570 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl"] Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.153624 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.153768 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx4mh\" (UniqueName: \"kubernetes.io/projected/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-kube-api-access-vx4mh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.153836 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.256676 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.257313 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx4mh\" (UniqueName: \"kubernetes.io/projected/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-kube-api-access-vx4mh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.257528 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.276789 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx4mh\" (UniqueName: \"kubernetes.io/projected/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-kube-api-access-vx4mh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" 
(UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.279754 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.280481 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fsmfl\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.386714 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.922855 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl"] Jan 31 09:55:55 crc kubenswrapper[4992]: I0131 09:55:55.981165 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" event={"ID":"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2","Type":"ContainerStarted","Data":"1376c31cf1a7a7ac6eb30b630d9a045841a2286c66f22b56f0c2d270158acdf3"} Jan 31 09:55:56 crc kubenswrapper[4992]: I0131 09:55:56.990049 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" event={"ID":"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2","Type":"ContainerStarted","Data":"bf2e5113db504f0f21f9e3c073b56506fb4f82095aad338c3460e57381cb9b5d"} Jan 31 09:55:57 crc kubenswrapper[4992]: I0131 09:55:57.009182 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" podStartSLOduration=1.497029266 podStartE2EDuration="2.009165508s" podCreationTimestamp="2026-01-31 09:55:55 +0000 UTC" firstStartedPulling="2026-01-31 09:55:55.929878633 +0000 UTC m=+1851.901270620" lastFinishedPulling="2026-01-31 09:55:56.442014865 +0000 UTC m=+1852.413406862" observedRunningTime="2026-01-31 09:55:57.005776931 +0000 UTC m=+1852.977168948" watchObservedRunningTime="2026-01-31 09:55:57.009165508 +0000 UTC m=+1852.980557495" Jan 31 09:55:57 crc kubenswrapper[4992]: I0131 09:55:57.187332 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:55:57 crc kubenswrapper[4992]: E0131 09:55:57.188021 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.053375 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zxdvt"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.104261 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bd9b-account-create-update-6msfr"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.116071 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vlvb9"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.122915 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2ngx9"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.130375 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3ffc-account-create-update-jxhx5"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.137862 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zxdvt"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.145302 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3ffc-account-create-update-jxhx5"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.153572 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bd9b-account-create-update-6msfr"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.163388 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vlvb9"] Jan 31 09:55:58 crc kubenswrapper[4992]: I0131 09:55:58.171861 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2ngx9"] Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 
09:55:59.029739 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-200a-account-create-update-5x5ln"] Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.041158 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-200a-account-create-update-5x5ln"] Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.192229 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2658f892-db13-4d57-96eb-dcf80264e0f7" path="/var/lib/kubelet/pods/2658f892-db13-4d57-96eb-dcf80264e0f7/volumes" Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.192796 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4b5e45-ca2f-4db6-add4-0b395981b5cd" path="/var/lib/kubelet/pods/4d4b5e45-ca2f-4db6-add4-0b395981b5cd/volumes" Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.193288 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c2c6d9-ec19-4eae-aece-08798fa4fc95" path="/var/lib/kubelet/pods/95c2c6d9-ec19-4eae-aece-08798fa4fc95/volumes" Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.193869 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0" path="/var/lib/kubelet/pods/9b1cf4c0-06a8-470c-bdb8-7ab0989a94f0/volumes" Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.194896 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6187e6e-51fe-4cb2-a042-afce69a45d6b" path="/var/lib/kubelet/pods/a6187e6e-51fe-4cb2-a042-afce69a45d6b/volumes" Jan 31 09:55:59 crc kubenswrapper[4992]: I0131 09:55:59.195381 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24bf444-c444-4559-af1a-5bd38f2ce48d" path="/var/lib/kubelet/pods/e24bf444-c444-4559-af1a-5bd38f2ce48d/volumes" Jan 31 09:56:04 crc kubenswrapper[4992]: I0131 09:56:04.050680 4992 generic.go:334] "Generic (PLEG): container finished" podID="2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" 
containerID="bf2e5113db504f0f21f9e3c073b56506fb4f82095aad338c3460e57381cb9b5d" exitCode=0 Jan 31 09:56:04 crc kubenswrapper[4992]: I0131 09:56:04.050742 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" event={"ID":"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2","Type":"ContainerDied","Data":"bf2e5113db504f0f21f9e3c073b56506fb4f82095aad338c3460e57381cb9b5d"} Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.496230 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.541387 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-ssh-key-openstack-edpm-ipam\") pod \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.541547 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-inventory\") pod \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.541586 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx4mh\" (UniqueName: \"kubernetes.io/projected/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-kube-api-access-vx4mh\") pod \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\" (UID: \"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2\") " Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.547909 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-kube-api-access-vx4mh" (OuterVolumeSpecName: "kube-api-access-vx4mh") pod 
"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" (UID: "2113bb3b-f70a-4261-ab56-1b3ef0f91bc2"). InnerVolumeSpecName "kube-api-access-vx4mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.572512 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-inventory" (OuterVolumeSpecName: "inventory") pod "2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" (UID: "2113bb3b-f70a-4261-ab56-1b3ef0f91bc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.574581 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" (UID: "2113bb3b-f70a-4261-ab56-1b3ef0f91bc2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.643517 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.643558 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:05 crc kubenswrapper[4992]: I0131 09:56:05.643570 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx4mh\" (UniqueName: \"kubernetes.io/projected/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2-kube-api-access-vx4mh\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.077361 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" event={"ID":"2113bb3b-f70a-4261-ab56-1b3ef0f91bc2","Type":"ContainerDied","Data":"1376c31cf1a7a7ac6eb30b630d9a045841a2286c66f22b56f0c2d270158acdf3"} Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.077731 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1376c31cf1a7a7ac6eb30b630d9a045841a2286c66f22b56f0c2d270158acdf3" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.077494 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.174075 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl"] Jan 31 09:56:06 crc kubenswrapper[4992]: E0131 09:56:06.174814 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.174856 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.175320 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.178262 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.181040 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.181444 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.181459 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.181451 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.201544 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl"] Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.361104 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.361200 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2pjn\" (UniqueName: \"kubernetes.io/projected/44179ae4-aae0-4d90-97b9-99bbe8905f33-kube-api-access-m2pjn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 
09:56:06.361235 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.463886 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.464021 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2pjn\" (UniqueName: \"kubernetes.io/projected/44179ae4-aae0-4d90-97b9-99bbe8905f33-kube-api-access-m2pjn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.464087 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.472000 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.472131 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.485787 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2pjn\" (UniqueName: \"kubernetes.io/projected/44179ae4-aae0-4d90-97b9-99bbe8905f33-kube-api-access-m2pjn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.500205 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:06 crc kubenswrapper[4992]: I0131 09:56:06.991601 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl"] Jan 31 09:56:06 crc kubenswrapper[4992]: W0131 09:56:06.997064 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44179ae4_aae0_4d90_97b9_99bbe8905f33.slice/crio-9701568664af23cb6362495e591400c52bd2c1e329be9bd0e3e13811dbe892f6 WatchSource:0}: Error finding container 9701568664af23cb6362495e591400c52bd2c1e329be9bd0e3e13811dbe892f6: Status 404 returned error can't find the container with id 9701568664af23cb6362495e591400c52bd2c1e329be9bd0e3e13811dbe892f6 Jan 31 09:56:07 crc kubenswrapper[4992]: I0131 09:56:07.090254 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" event={"ID":"44179ae4-aae0-4d90-97b9-99bbe8905f33","Type":"ContainerStarted","Data":"9701568664af23cb6362495e591400c52bd2c1e329be9bd0e3e13811dbe892f6"} Jan 31 09:56:08 crc kubenswrapper[4992]: I0131 09:56:08.099990 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" event={"ID":"44179ae4-aae0-4d90-97b9-99bbe8905f33","Type":"ContainerStarted","Data":"a859d4ceb7fb070808aa6e6b52860fd54f86e2aa08d4142b351d95252242352c"} Jan 31 09:56:08 crc kubenswrapper[4992]: I0131 09:56:08.118980 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" podStartSLOduration=1.728529767 podStartE2EDuration="2.11896129s" podCreationTimestamp="2026-01-31 09:56:06 +0000 UTC" firstStartedPulling="2026-01-31 09:56:06.999117049 +0000 UTC m=+1862.970509036" lastFinishedPulling="2026-01-31 09:56:07.389548572 +0000 UTC m=+1863.360940559" 
observedRunningTime="2026-01-31 09:56:08.116043966 +0000 UTC m=+1864.087435963" watchObservedRunningTime="2026-01-31 09:56:08.11896129 +0000 UTC m=+1864.090353277" Jan 31 09:56:12 crc kubenswrapper[4992]: I0131 09:56:12.183005 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:56:12 crc kubenswrapper[4992]: E0131 09:56:12.184061 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.156088 4992 scope.go:117] "RemoveContainer" containerID="1001ba270c8abf485e4825fe15c22c6d60928c6894927fafaeb7880b867c4f06" Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.171669 4992 generic.go:334] "Generic (PLEG): container finished" podID="44179ae4-aae0-4d90-97b9-99bbe8905f33" containerID="a859d4ceb7fb070808aa6e6b52860fd54f86e2aa08d4142b351d95252242352c" exitCode=0 Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.171737 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" event={"ID":"44179ae4-aae0-4d90-97b9-99bbe8905f33","Type":"ContainerDied","Data":"a859d4ceb7fb070808aa6e6b52860fd54f86e2aa08d4142b351d95252242352c"} Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.184469 4992 scope.go:117] "RemoveContainer" containerID="a2facb1456cb995404008ac2aa01f932fce3c63d74bff74cfe6cdd1fd9884e5a" Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.219340 4992 scope.go:117] "RemoveContainer" containerID="e51a95646f638bd153f86790269eac27001728c4df8f42e79a04f66dea9f2a0e" Jan 31 09:56:17 crc 
kubenswrapper[4992]: I0131 09:56:17.258839 4992 scope.go:117] "RemoveContainer" containerID="8875fbb11a5cd52687d9aba7763c2524a322fbd6ebc2843102a147588ae3bb6d" Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.301461 4992 scope.go:117] "RemoveContainer" containerID="824907cd7b737411a0241f2f59f7d02d5aa3728882f84a81163d3e098f2793ec" Jan 31 09:56:17 crc kubenswrapper[4992]: I0131 09:56:17.347302 4992 scope.go:117] "RemoveContainer" containerID="9cfc3078597d8fd6b6bf9b55f515da2027be558e8fcdc92345a0cb093ef49c5c" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.645651 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.686019 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2pjn\" (UniqueName: \"kubernetes.io/projected/44179ae4-aae0-4d90-97b9-99bbe8905f33-kube-api-access-m2pjn\") pod \"44179ae4-aae0-4d90-97b9-99bbe8905f33\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.686147 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-inventory\") pod \"44179ae4-aae0-4d90-97b9-99bbe8905f33\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.686198 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-ssh-key-openstack-edpm-ipam\") pod \"44179ae4-aae0-4d90-97b9-99bbe8905f33\" (UID: \"44179ae4-aae0-4d90-97b9-99bbe8905f33\") " Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.700842 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/44179ae4-aae0-4d90-97b9-99bbe8905f33-kube-api-access-m2pjn" (OuterVolumeSpecName: "kube-api-access-m2pjn") pod "44179ae4-aae0-4d90-97b9-99bbe8905f33" (UID: "44179ae4-aae0-4d90-97b9-99bbe8905f33"). InnerVolumeSpecName "kube-api-access-m2pjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.726955 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-inventory" (OuterVolumeSpecName: "inventory") pod "44179ae4-aae0-4d90-97b9-99bbe8905f33" (UID: "44179ae4-aae0-4d90-97b9-99bbe8905f33"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.729621 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "44179ae4-aae0-4d90-97b9-99bbe8905f33" (UID: "44179ae4-aae0-4d90-97b9-99bbe8905f33"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.789490 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.789643 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44179ae4-aae0-4d90-97b9-99bbe8905f33-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:18 crc kubenswrapper[4992]: I0131 09:56:18.789717 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2pjn\" (UniqueName: \"kubernetes.io/projected/44179ae4-aae0-4d90-97b9-99bbe8905f33-kube-api-access-m2pjn\") on node \"crc\" DevicePath \"\"" Jan 31 09:56:19 crc kubenswrapper[4992]: I0131 09:56:19.188592 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" Jan 31 09:56:19 crc kubenswrapper[4992]: I0131 09:56:19.192175 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl" event={"ID":"44179ae4-aae0-4d90-97b9-99bbe8905f33","Type":"ContainerDied","Data":"9701568664af23cb6362495e591400c52bd2c1e329be9bd0e3e13811dbe892f6"} Jan 31 09:56:19 crc kubenswrapper[4992]: I0131 09:56:19.192213 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9701568664af23cb6362495e591400c52bd2c1e329be9bd0e3e13811dbe892f6" Jan 31 09:56:27 crc kubenswrapper[4992]: I0131 09:56:27.182832 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:56:27 crc kubenswrapper[4992]: E0131 09:56:27.183620 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:56:36 crc kubenswrapper[4992]: I0131 09:56:36.050765 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6ftmf"] Jan 31 09:56:36 crc kubenswrapper[4992]: I0131 09:56:36.061480 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6ftmf"] Jan 31 09:56:37 crc kubenswrapper[4992]: I0131 09:56:37.192402 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb0cadc-8ce2-4abd-8a60-461019fb6f6d" path="/var/lib/kubelet/pods/8cb0cadc-8ce2-4abd-8a60-461019fb6f6d/volumes" Jan 31 09:56:41 crc kubenswrapper[4992]: I0131 09:56:41.182739 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:56:41 crc kubenswrapper[4992]: E0131 09:56:41.183248 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 09:56:54 crc kubenswrapper[4992]: I0131 09:56:54.182792 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 09:56:54 crc kubenswrapper[4992]: I0131 09:56:54.506580 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"5eeeb29fa9dd8d5c28d0e393025561263faa39377b4ed1db1e9b4f1fd66917c5"} Jan 31 09:56:58 crc kubenswrapper[4992]: I0131 09:56:58.043386 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dg67m"] Jan 31 09:56:58 crc kubenswrapper[4992]: I0131 09:56:58.053176 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hdjxg"] Jan 31 09:56:58 crc kubenswrapper[4992]: I0131 09:56:58.061299 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dg67m"] Jan 31 09:56:58 crc kubenswrapper[4992]: I0131 09:56:58.068255 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hdjxg"] Jan 31 09:56:59 crc kubenswrapper[4992]: I0131 09:56:59.195602 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd87507-7257-4df9-9690-0c9d8b9f7556" path="/var/lib/kubelet/pods/1bd87507-7257-4df9-9690-0c9d8b9f7556/volumes" Jan 31 09:56:59 crc kubenswrapper[4992]: I0131 09:56:59.196948 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31a3cea-f382-4709-b605-b6474a9c722c" path="/var/lib/kubelet/pods/d31a3cea-f382-4709-b605-b6474a9c722c/volumes" Jan 31 09:57:17 crc kubenswrapper[4992]: I0131 09:57:17.459655 4992 scope.go:117] "RemoveContainer" containerID="4d8c08f9453f44e563ea5382a90b0b7278aed11e3ba17611cf2b7fd5fd09175f" Jan 31 09:57:17 crc kubenswrapper[4992]: I0131 09:57:17.505756 4992 scope.go:117] "RemoveContainer" containerID="7b66c990fb38769ee48c362c0adb66670154b9e1bf0caa1e4f4668b01b780c37" Jan 31 09:57:17 crc kubenswrapper[4992]: I0131 09:57:17.543904 4992 scope.go:117] "RemoveContainer" containerID="3dd7393a58500f1a270c541f212d0a7a9031ca20e9c9cb77f7528b5986633832" Jan 31 09:57:43 crc kubenswrapper[4992]: I0131 09:57:43.068949 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-zffb4"] Jan 31 09:57:43 crc kubenswrapper[4992]: I0131 09:57:43.078680 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zffb4"] Jan 31 09:57:43 crc kubenswrapper[4992]: I0131 09:57:43.193569 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f340cf4-078d-4c29-819e-0e29fc2ff63b" path="/var/lib/kubelet/pods/2f340cf4-078d-4c29-819e-0e29fc2ff63b/volumes" Jan 31 09:58:17 crc kubenswrapper[4992]: I0131 09:58:17.644846 4992 scope.go:117] "RemoveContainer" containerID="d179f0c605b0e131a357af624807c1d9dd307cc6a6a12afe3fcb80c4d1e2deec" Jan 31 09:59:15 crc kubenswrapper[4992]: I0131 09:59:15.301327 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:59:15 crc kubenswrapper[4992]: I0131 09:59:15.302043 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:59:43 crc kubenswrapper[4992]: I0131 09:59:43.902988 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fw8tq"] Jan 31 09:59:43 crc kubenswrapper[4992]: E0131 09:59:43.904020 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44179ae4-aae0-4d90-97b9-99bbe8905f33" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:59:43 crc kubenswrapper[4992]: I0131 09:59:43.904039 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="44179ae4-aae0-4d90-97b9-99bbe8905f33" 
containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:59:43 crc kubenswrapper[4992]: I0131 09:59:43.904240 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="44179ae4-aae0-4d90-97b9-99bbe8905f33" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 09:59:43 crc kubenswrapper[4992]: I0131 09:59:43.905884 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:43 crc kubenswrapper[4992]: I0131 09:59:43.916579 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw8tq"] Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.045647 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-catalog-content\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.045798 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79x8g\" (UniqueName: \"kubernetes.io/projected/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-kube-api-access-79x8g\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.045951 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-utilities\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.147945 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-79x8g\" (UniqueName: \"kubernetes.io/projected/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-kube-api-access-79x8g\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.148017 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-utilities\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.148101 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-catalog-content\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.148647 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-catalog-content\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.148741 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-utilities\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.171856 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79x8g\" 
(UniqueName: \"kubernetes.io/projected/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-kube-api-access-79x8g\") pod \"redhat-operators-fw8tq\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.225976 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:44 crc kubenswrapper[4992]: I0131 09:59:44.689639 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw8tq"] Jan 31 09:59:45 crc kubenswrapper[4992]: I0131 09:59:45.223176 4992 generic.go:334] "Generic (PLEG): container finished" podID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerID="61954568a25fbd47ca79f1e2078e73fe7a615ec1d2f9e21fc02834e09464953c" exitCode=0 Jan 31 09:59:45 crc kubenswrapper[4992]: I0131 09:59:45.223301 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerDied","Data":"61954568a25fbd47ca79f1e2078e73fe7a615ec1d2f9e21fc02834e09464953c"} Jan 31 09:59:45 crc kubenswrapper[4992]: I0131 09:59:45.223521 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerStarted","Data":"125182995d6a883415357539ac2611515c7d8eea013caab9576aa8a3737d0f6e"} Jan 31 09:59:45 crc kubenswrapper[4992]: I0131 09:59:45.224808 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:59:45 crc kubenswrapper[4992]: I0131 09:59:45.301060 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 31 09:59:45 crc kubenswrapper[4992]: I0131 09:59:45.301117 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:59:46 crc kubenswrapper[4992]: I0131 09:59:46.233259 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerStarted","Data":"6e7b4479c4fc7781f45e06fe93e046862105ff59d4042f56d558ea0dc50adac7"} Jan 31 09:59:47 crc kubenswrapper[4992]: I0131 09:59:47.243227 4992 generic.go:334] "Generic (PLEG): container finished" podID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerID="6e7b4479c4fc7781f45e06fe93e046862105ff59d4042f56d558ea0dc50adac7" exitCode=0 Jan 31 09:59:47 crc kubenswrapper[4992]: I0131 09:59:47.243281 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerDied","Data":"6e7b4479c4fc7781f45e06fe93e046862105ff59d4042f56d558ea0dc50adac7"} Jan 31 09:59:48 crc kubenswrapper[4992]: I0131 09:59:48.253324 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerStarted","Data":"2335aab0d160a3410c2dff2c7eda5f6be137b52fae0a86be70d7e6c2e1a09be8"} Jan 31 09:59:48 crc kubenswrapper[4992]: I0131 09:59:48.884235 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fw8tq" podStartSLOduration=3.397546484 podStartE2EDuration="5.884214602s" podCreationTimestamp="2026-01-31 09:59:43 +0000 UTC" firstStartedPulling="2026-01-31 09:59:45.224595854 +0000 UTC 
m=+2081.195987841" lastFinishedPulling="2026-01-31 09:59:47.711263932 +0000 UTC m=+2083.682655959" observedRunningTime="2026-01-31 09:59:48.276608543 +0000 UTC m=+2084.248000540" watchObservedRunningTime="2026-01-31 09:59:48.884214602 +0000 UTC m=+2084.855606589" Jan 31 09:59:48 crc kubenswrapper[4992]: I0131 09:59:48.884572 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djsrs"] Jan 31 09:59:48 crc kubenswrapper[4992]: I0131 09:59:48.886820 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:48 crc kubenswrapper[4992]: I0131 09:59:48.909839 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djsrs"] Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.051877 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-catalog-content\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.052024 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-utilities\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.052106 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsjxf\" (UniqueName: \"kubernetes.io/projected/35306efc-d8f0-481c-9fb0-f18570aa6e83-kube-api-access-xsjxf\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " 
pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.154320 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsjxf\" (UniqueName: \"kubernetes.io/projected/35306efc-d8f0-481c-9fb0-f18570aa6e83-kube-api-access-xsjxf\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.154412 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-catalog-content\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.154555 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-utilities\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.155120 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-utilities\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.155168 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-catalog-content\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " 
pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.185450 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsjxf\" (UniqueName: \"kubernetes.io/projected/35306efc-d8f0-481c-9fb0-f18570aa6e83-kube-api-access-xsjxf\") pod \"community-operators-djsrs\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.208818 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:49 crc kubenswrapper[4992]: I0131 09:59:49.846748 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djsrs"] Jan 31 09:59:50 crc kubenswrapper[4992]: I0131 09:59:50.278766 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerStarted","Data":"e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe"} Jan 31 09:59:50 crc kubenswrapper[4992]: I0131 09:59:50.279285 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerStarted","Data":"2e52ab49290999d08ddadc9d1cbeca6271b37e77dfe9ee6610c8e59e88d22ccd"} Jan 31 09:59:52 crc kubenswrapper[4992]: I0131 09:59:52.300573 4992 generic.go:334] "Generic (PLEG): container finished" podID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerID="e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe" exitCode=0 Jan 31 09:59:52 crc kubenswrapper[4992]: I0131 09:59:52.300844 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" 
event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerDied","Data":"e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe"} Jan 31 09:59:53 crc kubenswrapper[4992]: I0131 09:59:53.312789 4992 generic.go:334] "Generic (PLEG): container finished" podID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerID="36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc" exitCode=0 Jan 31 09:59:53 crc kubenswrapper[4992]: I0131 09:59:53.313102 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerDied","Data":"36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc"} Jan 31 09:59:54 crc kubenswrapper[4992]: I0131 09:59:54.226160 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:54 crc kubenswrapper[4992]: I0131 09:59:54.227350 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:54 crc kubenswrapper[4992]: I0131 09:59:54.277230 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:54 crc kubenswrapper[4992]: I0131 09:59:54.322207 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerStarted","Data":"d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437"} Jan 31 09:59:54 crc kubenswrapper[4992]: I0131 09:59:54.348352 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djsrs" podStartSLOduration=4.820603155 podStartE2EDuration="6.348337389s" podCreationTimestamp="2026-01-31 09:59:48 +0000 UTC" firstStartedPulling="2026-01-31 09:59:52.303754681 +0000 UTC 
m=+2088.275146668" lastFinishedPulling="2026-01-31 09:59:53.831488915 +0000 UTC m=+2089.802880902" observedRunningTime="2026-01-31 09:59:54.341890313 +0000 UTC m=+2090.313282300" watchObservedRunningTime="2026-01-31 09:59:54.348337389 +0000 UTC m=+2090.319729376" Jan 31 09:59:54 crc kubenswrapper[4992]: I0131 09:59:54.376700 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:56 crc kubenswrapper[4992]: I0131 09:59:56.484890 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw8tq"] Jan 31 09:59:56 crc kubenswrapper[4992]: I0131 09:59:56.485360 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fw8tq" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="registry-server" containerID="cri-o://2335aab0d160a3410c2dff2c7eda5f6be137b52fae0a86be70d7e6c2e1a09be8" gracePeriod=2 Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.345751 4992 generic.go:334] "Generic (PLEG): container finished" podID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerID="2335aab0d160a3410c2dff2c7eda5f6be137b52fae0a86be70d7e6c2e1a09be8" exitCode=0 Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.345985 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerDied","Data":"2335aab0d160a3410c2dff2c7eda5f6be137b52fae0a86be70d7e6c2e1a09be8"} Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.464090 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.617612 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-catalog-content\") pod \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.617713 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79x8g\" (UniqueName: \"kubernetes.io/projected/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-kube-api-access-79x8g\") pod \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.617972 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-utilities\") pod \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\" (UID: \"15de5c74-07f2-4673-96d4-fc1f1f2f44bc\") " Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.619744 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-utilities" (OuterVolumeSpecName: "utilities") pod "15de5c74-07f2-4673-96d4-fc1f1f2f44bc" (UID: "15de5c74-07f2-4673-96d4-fc1f1f2f44bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.627463 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-kube-api-access-79x8g" (OuterVolumeSpecName: "kube-api-access-79x8g") pod "15de5c74-07f2-4673-96d4-fc1f1f2f44bc" (UID: "15de5c74-07f2-4673-96d4-fc1f1f2f44bc"). InnerVolumeSpecName "kube-api-access-79x8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.720144 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.720186 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79x8g\" (UniqueName: \"kubernetes.io/projected/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-kube-api-access-79x8g\") on node \"crc\" DevicePath \"\"" Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.784633 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15de5c74-07f2-4673-96d4-fc1f1f2f44bc" (UID: "15de5c74-07f2-4673-96d4-fc1f1f2f44bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:59:57 crc kubenswrapper[4992]: I0131 09:59:57.822282 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15de5c74-07f2-4673-96d4-fc1f1f2f44bc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.355072 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw8tq" event={"ID":"15de5c74-07f2-4673-96d4-fc1f1f2f44bc","Type":"ContainerDied","Data":"125182995d6a883415357539ac2611515c7d8eea013caab9576aa8a3737d0f6e"} Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.355140 4992 scope.go:117] "RemoveContainer" containerID="2335aab0d160a3410c2dff2c7eda5f6be137b52fae0a86be70d7e6c2e1a09be8" Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.355334 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fw8tq" Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.379596 4992 scope.go:117] "RemoveContainer" containerID="6e7b4479c4fc7781f45e06fe93e046862105ff59d4042f56d558ea0dc50adac7" Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.392158 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw8tq"] Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.411224 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fw8tq"] Jan 31 09:59:58 crc kubenswrapper[4992]: I0131 09:59:58.425460 4992 scope.go:117] "RemoveContainer" containerID="61954568a25fbd47ca79f1e2078e73fe7a615ec1d2f9e21fc02834e09464953c" Jan 31 09:59:59 crc kubenswrapper[4992]: I0131 09:59:59.192038 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" path="/var/lib/kubelet/pods/15de5c74-07f2-4673-96d4-fc1f1f2f44bc/volumes" Jan 31 09:59:59 crc kubenswrapper[4992]: I0131 09:59:59.209445 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:59 crc kubenswrapper[4992]: I0131 09:59:59.209497 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:59 crc kubenswrapper[4992]: I0131 09:59:59.272743 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djsrs" Jan 31 09:59:59 crc kubenswrapper[4992]: I0131 09:59:59.414943 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djsrs" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.142861 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t"] Jan 31 10:00:00 crc 
kubenswrapper[4992]: E0131 10:00:00.143248 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="extract-content" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.143265 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="extract-content" Jan 31 10:00:00 crc kubenswrapper[4992]: E0131 10:00:00.143284 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="extract-utilities" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.143292 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="extract-utilities" Jan 31 10:00:00 crc kubenswrapper[4992]: E0131 10:00:00.143304 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="registry-server" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.143310 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="registry-server" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.143503 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="15de5c74-07f2-4673-96d4-fc1f1f2f44bc" containerName="registry-server" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.144052 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.148357 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.148574 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.160774 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t"] Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.275405 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51a15afd-fa57-4e10-acd5-ded126489dd8-secret-volume\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.275512 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnn28\" (UniqueName: \"kubernetes.io/projected/51a15afd-fa57-4e10-acd5-ded126489dd8-kube-api-access-xnn28\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.275971 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51a15afd-fa57-4e10-acd5-ded126489dd8-config-volume\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.378011 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51a15afd-fa57-4e10-acd5-ded126489dd8-config-volume\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.378136 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51a15afd-fa57-4e10-acd5-ded126489dd8-secret-volume\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.378180 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnn28\" (UniqueName: \"kubernetes.io/projected/51a15afd-fa57-4e10-acd5-ded126489dd8-kube-api-access-xnn28\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.379055 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51a15afd-fa57-4e10-acd5-ded126489dd8-config-volume\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.388743 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/51a15afd-fa57-4e10-acd5-ded126489dd8-secret-volume\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.397778 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnn28\" (UniqueName: \"kubernetes.io/projected/51a15afd-fa57-4e10-acd5-ded126489dd8-kube-api-access-xnn28\") pod \"collect-profiles-29497560-hgm7t\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.464713 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.879980 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djsrs"] Jan 31 10:00:00 crc kubenswrapper[4992]: I0131 10:00:00.903877 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t"] Jan 31 10:00:00 crc kubenswrapper[4992]: W0131 10:00:00.907757 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a15afd_fa57_4e10_acd5_ded126489dd8.slice/crio-a1b295ad2b41a1e2cbc67e0121cdd02c30fbb6a17761a1528973419d685b394f WatchSource:0}: Error finding container a1b295ad2b41a1e2cbc67e0121cdd02c30fbb6a17761a1528973419d685b394f: Status 404 returned error can't find the container with id a1b295ad2b41a1e2cbc67e0121cdd02c30fbb6a17761a1528973419d685b394f Jan 31 10:00:01 crc kubenswrapper[4992]: I0131 10:00:01.384706 4992 generic.go:334] "Generic (PLEG): container finished" podID="51a15afd-fa57-4e10-acd5-ded126489dd8" 
containerID="2a93bdb077456c4713694d19462e24a2869278dfa0d950c21e387020802079e1" exitCode=0 Jan 31 10:00:01 crc kubenswrapper[4992]: I0131 10:00:01.384787 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" event={"ID":"51a15afd-fa57-4e10-acd5-ded126489dd8","Type":"ContainerDied","Data":"2a93bdb077456c4713694d19462e24a2869278dfa0d950c21e387020802079e1"} Jan 31 10:00:01 crc kubenswrapper[4992]: I0131 10:00:01.384844 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" event={"ID":"51a15afd-fa57-4e10-acd5-ded126489dd8","Type":"ContainerStarted","Data":"a1b295ad2b41a1e2cbc67e0121cdd02c30fbb6a17761a1528973419d685b394f"} Jan 31 10:00:01 crc kubenswrapper[4992]: I0131 10:00:01.384969 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djsrs" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="registry-server" containerID="cri-o://d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437" gracePeriod=2 Jan 31 10:00:01 crc kubenswrapper[4992]: I0131 10:00:01.929978 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-djsrs" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.013779 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsjxf\" (UniqueName: \"kubernetes.io/projected/35306efc-d8f0-481c-9fb0-f18570aa6e83-kube-api-access-xsjxf\") pod \"35306efc-d8f0-481c-9fb0-f18570aa6e83\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.013876 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-catalog-content\") pod \"35306efc-d8f0-481c-9fb0-f18570aa6e83\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.013936 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-utilities\") pod \"35306efc-d8f0-481c-9fb0-f18570aa6e83\" (UID: \"35306efc-d8f0-481c-9fb0-f18570aa6e83\") " Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.015076 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-utilities" (OuterVolumeSpecName: "utilities") pod "35306efc-d8f0-481c-9fb0-f18570aa6e83" (UID: "35306efc-d8f0-481c-9fb0-f18570aa6e83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.025323 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35306efc-d8f0-481c-9fb0-f18570aa6e83-kube-api-access-xsjxf" (OuterVolumeSpecName: "kube-api-access-xsjxf") pod "35306efc-d8f0-481c-9fb0-f18570aa6e83" (UID: "35306efc-d8f0-481c-9fb0-f18570aa6e83"). InnerVolumeSpecName "kube-api-access-xsjxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.063220 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35306efc-d8f0-481c-9fb0-f18570aa6e83" (UID: "35306efc-d8f0-481c-9fb0-f18570aa6e83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.116607 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.116647 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsjxf\" (UniqueName: \"kubernetes.io/projected/35306efc-d8f0-481c-9fb0-f18570aa6e83-kube-api-access-xsjxf\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.116662 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35306efc-d8f0-481c-9fb0-f18570aa6e83-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.395001 4992 generic.go:334] "Generic (PLEG): container finished" podID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerID="d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437" exitCode=0 Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.395097 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerDied","Data":"d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437"} Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.395151 4992 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-djsrs" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.395173 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djsrs" event={"ID":"35306efc-d8f0-481c-9fb0-f18570aa6e83","Type":"ContainerDied","Data":"2e52ab49290999d08ddadc9d1cbeca6271b37e77dfe9ee6610c8e59e88d22ccd"} Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.395202 4992 scope.go:117] "RemoveContainer" containerID="d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.431801 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djsrs"] Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.442188 4992 scope.go:117] "RemoveContainer" containerID="36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.451515 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djsrs"] Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.470489 4992 scope.go:117] "RemoveContainer" containerID="e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.537372 4992 scope.go:117] "RemoveContainer" containerID="d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437" Jan 31 10:00:02 crc kubenswrapper[4992]: E0131 10:00:02.537937 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437\": container with ID starting with d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437 not found: ID does not exist" containerID="d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.537988 
4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437"} err="failed to get container status \"d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437\": rpc error: code = NotFound desc = could not find container \"d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437\": container with ID starting with d16f03df63defda142021810ea957a01330408c0a5e32f8ef4296bc4f1a8d437 not found: ID does not exist" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.538023 4992 scope.go:117] "RemoveContainer" containerID="36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc" Jan 31 10:00:02 crc kubenswrapper[4992]: E0131 10:00:02.553082 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc\": container with ID starting with 36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc not found: ID does not exist" containerID="36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.553134 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc"} err="failed to get container status \"36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc\": rpc error: code = NotFound desc = could not find container \"36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc\": container with ID starting with 36aa29339fec435a0069321e4d01a29e534005bc6a0dc01fb1f5f2a10d3c06fc not found: ID does not exist" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.553164 4992 scope.go:117] "RemoveContainer" containerID="e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe" Jan 31 10:00:02 crc kubenswrapper[4992]: E0131 
10:00:02.553663 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe\": container with ID starting with e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe not found: ID does not exist" containerID="e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.553717 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe"} err="failed to get container status \"e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe\": rpc error: code = NotFound desc = could not find container \"e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe\": container with ID starting with e97cbeb3b1e7f005f58dc9481509c0fb3e01bcb72ffe6df2d8f196f013ef58fe not found: ID does not exist" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.752290 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.828601 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51a15afd-fa57-4e10-acd5-ded126489dd8-config-volume\") pod \"51a15afd-fa57-4e10-acd5-ded126489dd8\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.828685 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnn28\" (UniqueName: \"kubernetes.io/projected/51a15afd-fa57-4e10-acd5-ded126489dd8-kube-api-access-xnn28\") pod \"51a15afd-fa57-4e10-acd5-ded126489dd8\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.828879 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51a15afd-fa57-4e10-acd5-ded126489dd8-secret-volume\") pod \"51a15afd-fa57-4e10-acd5-ded126489dd8\" (UID: \"51a15afd-fa57-4e10-acd5-ded126489dd8\") " Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.829579 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a15afd-fa57-4e10-acd5-ded126489dd8-config-volume" (OuterVolumeSpecName: "config-volume") pod "51a15afd-fa57-4e10-acd5-ded126489dd8" (UID: "51a15afd-fa57-4e10-acd5-ded126489dd8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.830057 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51a15afd-fa57-4e10-acd5-ded126489dd8-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.846679 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a15afd-fa57-4e10-acd5-ded126489dd8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "51a15afd-fa57-4e10-acd5-ded126489dd8" (UID: "51a15afd-fa57-4e10-acd5-ded126489dd8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.846717 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a15afd-fa57-4e10-acd5-ded126489dd8-kube-api-access-xnn28" (OuterVolumeSpecName: "kube-api-access-xnn28") pod "51a15afd-fa57-4e10-acd5-ded126489dd8" (UID: "51a15afd-fa57-4e10-acd5-ded126489dd8"). InnerVolumeSpecName "kube-api-access-xnn28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.931947 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51a15afd-fa57-4e10-acd5-ded126489dd8-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:02 crc kubenswrapper[4992]: I0131 10:00:02.931995 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnn28\" (UniqueName: \"kubernetes.io/projected/51a15afd-fa57-4e10-acd5-ded126489dd8-kube-api-access-xnn28\") on node \"crc\" DevicePath \"\"" Jan 31 10:00:03 crc kubenswrapper[4992]: I0131 10:00:03.195379 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" path="/var/lib/kubelet/pods/35306efc-d8f0-481c-9fb0-f18570aa6e83/volumes" Jan 31 10:00:03 crc kubenswrapper[4992]: I0131 10:00:03.405103 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" Jan 31 10:00:03 crc kubenswrapper[4992]: I0131 10:00:03.405136 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t" event={"ID":"51a15afd-fa57-4e10-acd5-ded126489dd8","Type":"ContainerDied","Data":"a1b295ad2b41a1e2cbc67e0121cdd02c30fbb6a17761a1528973419d685b394f"} Jan 31 10:00:03 crc kubenswrapper[4992]: I0131 10:00:03.405184 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b295ad2b41a1e2cbc67e0121cdd02c30fbb6a17761a1528973419d685b394f" Jan 31 10:00:03 crc kubenswrapper[4992]: I0131 10:00:03.824828 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n"] Jan 31 10:00:03 crc kubenswrapper[4992]: I0131 10:00:03.832975 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-tqk6n"] Jan 31 10:00:05 crc kubenswrapper[4992]: I0131 10:00:05.200632 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c59658-5ed8-4cef-b36d-2a1e44ec6976" path="/var/lib/kubelet/pods/65c59658-5ed8-4cef-b36d-2a1e44ec6976/volumes" Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.300928 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.301436 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.301484 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.302091 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5eeeb29fa9dd8d5c28d0e393025561263faa39377b4ed1db1e9b4f1fd66917c5"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.302153 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" 
containerName="machine-config-daemon" containerID="cri-o://5eeeb29fa9dd8d5c28d0e393025561263faa39377b4ed1db1e9b4f1fd66917c5" gracePeriod=600 Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.514001 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="5eeeb29fa9dd8d5c28d0e393025561263faa39377b4ed1db1e9b4f1fd66917c5" exitCode=0 Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.514077 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"5eeeb29fa9dd8d5c28d0e393025561263faa39377b4ed1db1e9b4f1fd66917c5"} Jan 31 10:00:15 crc kubenswrapper[4992]: I0131 10:00:15.514360 4992 scope.go:117] "RemoveContainer" containerID="1e6ca919f6ef7d675bac29150308cc132d6f33069f2df7753fadb82ec95d05a9" Jan 31 10:00:16 crc kubenswrapper[4992]: I0131 10:00:16.523409 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81"} Jan 31 10:00:17 crc kubenswrapper[4992]: I0131 10:00:17.743213 4992 scope.go:117] "RemoveContainer" containerID="a5f07f12e53bc482271dbac8b5d3aec9e8654c73d41a2db7e4becbb382281eea" Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.807370 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.813901 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.821149 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.829203 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.836364 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.843262 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.850856 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-vq2bk"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.859075 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fhwbp"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.865058 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-69gz8"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.872755 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-x4vkd"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.879432 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-79nt8"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.886831 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n2l5z"] Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.893943 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl"] Jan 31 10:00:20 crc 
kubenswrapper[4992]: I0131 10:00:20.901366 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fsmfl"]
Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.907753 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr"]
Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.914906 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fhwbp"]
Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.921800 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-dwzdl"]
Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.928989 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv"]
Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.936202 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhqlr"]
Jan 31 10:00:20 crc kubenswrapper[4992]: I0131 10:00:20.946061 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-hmqpv"]
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.199523 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3d4682-1214-4cda-a6d4-07bd6fe3b816" path="/var/lib/kubelet/pods/1e3d4682-1214-4cda-a6d4-07bd6fe3b816/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.200762 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2113bb3b-f70a-4261-ab56-1b3ef0f91bc2" path="/var/lib/kubelet/pods/2113bb3b-f70a-4261-ab56-1b3ef0f91bc2/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.201938 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361ac32c-fc8e-4a26-ac92-c64b5bba4ffd" path="/var/lib/kubelet/pods/361ac32c-fc8e-4a26-ac92-c64b5bba4ffd/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.203299 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44179ae4-aae0-4d90-97b9-99bbe8905f33" path="/var/lib/kubelet/pods/44179ae4-aae0-4d90-97b9-99bbe8905f33/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.205940 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="595068b2-328c-46b5-b5b1-da4d34af14b2" path="/var/lib/kubelet/pods/595068b2-328c-46b5-b5b1-da4d34af14b2/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.207249 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627e3db2-cdce-499c-88b6-ec31436246c0" path="/var/lib/kubelet/pods/627e3db2-cdce-499c-88b6-ec31436246c0/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.208386 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684b95fc-cf60-4200-84d5-e7024abd3534" path="/var/lib/kubelet/pods/684b95fc-cf60-4200-84d5-e7024abd3534/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.209779 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b59e5b4-c357-4fb5-8606-6a7fef1519cf" path="/var/lib/kubelet/pods/7b59e5b4-c357-4fb5-8606-6a7fef1519cf/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.210571 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26e4222-9259-4d18-a67c-a3890117d486" path="/var/lib/kubelet/pods/d26e4222-9259-4d18-a67c-a3890117d486/volumes"
Jan 31 10:00:21 crc kubenswrapper[4992]: I0131 10:00:21.211157 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e671e97e-21df-468c-8142-c4da5165814c" path="/var/lib/kubelet/pods/e671e97e-21df-468c-8142-c4da5165814c/volumes"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.033571 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"]
Jan 31 10:00:26 crc kubenswrapper[4992]: E0131 10:00:26.034571 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a15afd-fa57-4e10-acd5-ded126489dd8" containerName="collect-profiles"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.034589 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a15afd-fa57-4e10-acd5-ded126489dd8" containerName="collect-profiles"
Jan 31 10:00:26 crc kubenswrapper[4992]: E0131 10:00:26.034622 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="registry-server"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.034630 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="registry-server"
Jan 31 10:00:26 crc kubenswrapper[4992]: E0131 10:00:26.034657 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="extract-content"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.034665 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="extract-content"
Jan 31 10:00:26 crc kubenswrapper[4992]: E0131 10:00:26.034683 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="extract-utilities"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.034691 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="extract-utilities"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.034879 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a15afd-fa57-4e10-acd5-ded126489dd8" containerName="collect-profiles"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.034911 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="35306efc-d8f0-481c-9fb0-f18570aa6e83" containerName="registry-server"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.035716 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.037434 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.037696 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.038741 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.038954 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.043991 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"]
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.047148 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.052200 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.052243 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.052295 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmkrk\" (UniqueName: \"kubernetes.io/projected/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-kube-api-access-bmkrk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.052323 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.052354 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.154245 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.154311 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.154387 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmkrk\" (UniqueName: \"kubernetes.io/projected/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-kube-api-access-bmkrk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.154459 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.154506 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.161302 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.161318 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.162701 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.164864 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.176097 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmkrk\" (UniqueName: \"kubernetes.io/projected/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-kube-api-access-bmkrk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.372781 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:26 crc kubenswrapper[4992]: I0131 10:00:26.937370 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"]
Jan 31 10:00:27 crc kubenswrapper[4992]: I0131 10:00:27.616657 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm" event={"ID":"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5","Type":"ContainerStarted","Data":"0535c636c383231e50f25e3e2d4a65385fe4b6da329525a50e0cf133dcdf3c51"}
Jan 31 10:00:27 crc kubenswrapper[4992]: I0131 10:00:27.616948 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm" event={"ID":"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5","Type":"ContainerStarted","Data":"a54ec532013ac42d91bd311ec37220300d41138bb8e9ad2f2a077e97d67d316f"}
Jan 31 10:00:27 crc kubenswrapper[4992]: I0131 10:00:27.632029 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm" podStartSLOduration=1.228946402 podStartE2EDuration="1.632007266s" podCreationTimestamp="2026-01-31 10:00:26 +0000 UTC" firstStartedPulling="2026-01-31 10:00:26.934385553 +0000 UTC m=+2122.905777540" lastFinishedPulling="2026-01-31 10:00:27.337446417 +0000 UTC m=+2123.308838404" observedRunningTime="2026-01-31 10:00:27.630964766 +0000 UTC m=+2123.602356753" watchObservedRunningTime="2026-01-31 10:00:27.632007266 +0000 UTC m=+2123.603399253"
Jan 31 10:00:40 crc kubenswrapper[4992]: I0131 10:00:40.729834 4992 generic.go:334] "Generic (PLEG): container finished" podID="a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" containerID="0535c636c383231e50f25e3e2d4a65385fe4b6da329525a50e0cf133dcdf3c51" exitCode=0
Jan 31 10:00:40 crc kubenswrapper[4992]: I0131 10:00:40.729978 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm" event={"ID":"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5","Type":"ContainerDied","Data":"0535c636c383231e50f25e3e2d4a65385fe4b6da329525a50e0cf133dcdf3c51"}
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.143671 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.269994 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ceph\") pod \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") "
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.270354 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmkrk\" (UniqueName: \"kubernetes.io/projected/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-kube-api-access-bmkrk\") pod \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") "
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.270382 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-repo-setup-combined-ca-bundle\") pod \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") "
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.270467 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-inventory\") pod \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") "
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.270528 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ssh-key-openstack-edpm-ipam\") pod \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\" (UID: \"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5\") "
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.276276 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ceph" (OuterVolumeSpecName: "ceph") pod "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" (UID: "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.277009 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-kube-api-access-bmkrk" (OuterVolumeSpecName: "kube-api-access-bmkrk") pod "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" (UID: "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5"). InnerVolumeSpecName "kube-api-access-bmkrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.277576 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" (UID: "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.307379 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" (UID: "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.310679 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-inventory" (OuterVolumeSpecName: "inventory") pod "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" (UID: "a3b485df-8ea6-46a4-8ba0-75bd8139e5b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.373665 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ceph\") on node \"crc\" DevicePath \"\""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.373734 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmkrk\" (UniqueName: \"kubernetes.io/projected/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-kube-api-access-bmkrk\") on node \"crc\" DevicePath \"\""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.373746 4992 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.373758 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-inventory\") on node \"crc\" DevicePath \"\""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.373769 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a3b485df-8ea6-46a4-8ba0-75bd8139e5b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.749989 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm" event={"ID":"a3b485df-8ea6-46a4-8ba0-75bd8139e5b5","Type":"ContainerDied","Data":"a54ec532013ac42d91bd311ec37220300d41138bb8e9ad2f2a077e97d67d316f"}
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.750031 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a54ec532013ac42d91bd311ec37220300d41138bb8e9ad2f2a077e97d67d316f"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.750049 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.866114 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"]
Jan 31 10:00:42 crc kubenswrapper[4992]: E0131 10:00:42.866727 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.866757 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.866966 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b485df-8ea6-46a4-8ba0-75bd8139e5b5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.867796 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.870230 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.871368 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.871455 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.871650 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.871963 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.888135 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxbz2\" (UniqueName: \"kubernetes.io/projected/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-kube-api-access-gxbz2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.888248 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.888286 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.888312 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"]
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.888505 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.888604 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.990117 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.990441 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.990622 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxbz2\" (UniqueName: \"kubernetes.io/projected/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-kube-api-access-gxbz2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.990736 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.990823 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.994518 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:42 crc kubenswrapper[4992]: I0131 10:00:42.994845 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:43 crc kubenswrapper[4992]: I0131 10:00:43.004601 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:43 crc kubenswrapper[4992]: I0131 10:00:43.006376 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:43 crc kubenswrapper[4992]: I0131 10:00:43.008935 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxbz2\" (UniqueName: \"kubernetes.io/projected/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-kube-api-access-gxbz2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:43 crc kubenswrapper[4992]: I0131 10:00:43.189361 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"
Jan 31 10:00:43 crc kubenswrapper[4992]: I0131 10:00:43.689498 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p"]
Jan 31 10:00:43 crc kubenswrapper[4992]: W0131 10:00:43.695032 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d6e849_d0a1_4943_b626_7c38e8ac6a11.slice/crio-14c9d51ac2fdbdf5428bc744927013fc8ea4473e2d14095f69708b8a340e47b7 WatchSource:0}: Error finding container 14c9d51ac2fdbdf5428bc744927013fc8ea4473e2d14095f69708b8a340e47b7: Status 404 returned error can't find the container with id 14c9d51ac2fdbdf5428bc744927013fc8ea4473e2d14095f69708b8a340e47b7
Jan 31 10:00:43 crc kubenswrapper[4992]: I0131 10:00:43.757688 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" event={"ID":"e9d6e849-d0a1-4943-b626-7c38e8ac6a11","Type":"ContainerStarted","Data":"14c9d51ac2fdbdf5428bc744927013fc8ea4473e2d14095f69708b8a340e47b7"}
Jan 31 10:00:44 crc kubenswrapper[4992]: I0131 10:00:44.770394 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" event={"ID":"e9d6e849-d0a1-4943-b626-7c38e8ac6a11","Type":"ContainerStarted","Data":"f7d4e701aa4fafc3565a54b890b5298c403e92f93e007cd00855657084921acf"}
Jan 31 10:00:44 crc kubenswrapper[4992]: I0131 10:00:44.800923 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" podStartSLOduration=2.401769278 podStartE2EDuration="2.800905852s" podCreationTimestamp="2026-01-31 10:00:42 +0000 UTC" firstStartedPulling="2026-01-31 10:00:43.69749172 +0000 UTC m=+2139.668883707" lastFinishedPulling="2026-01-31 10:00:44.096628294 +0000 UTC m=+2140.068020281" observedRunningTime="2026-01-31 10:00:44.791350358 +0000 UTC m=+2140.762742385" watchObservedRunningTime="2026-01-31 10:00:44.800905852 +0000 UTC m=+2140.772297849"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.650499 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jml68"]
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.653048 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.666546 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-utilities\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.666664 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-catalog-content\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.666946 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfnx\" (UniqueName: \"kubernetes.io/projected/0503634e-4b4f-462e-970b-8c0dad86309f-kube-api-access-ddfnx\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.678206 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jml68"]
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.768645 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-utilities\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.769045 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-catalog-content\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.769210 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfnx\" (UniqueName: \"kubernetes.io/projected/0503634e-4b4f-462e-970b-8c0dad86309f-kube-api-access-ddfnx\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.769840 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-catalog-content\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.770159 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-utilities\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.788790 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfnx\" (UniqueName: \"kubernetes.io/projected/0503634e-4b4f-462e-970b-8c0dad86309f-kube-api-access-ddfnx\") pod \"redhat-marketplace-jml68\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:46 crc kubenswrapper[4992]: I0131 10:00:46.985974 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jml68"
Jan 31 10:00:47 crc kubenswrapper[4992]: I0131 10:00:47.478104 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jml68"]
Jan 31 10:00:47 crc kubenswrapper[4992]: W0131 10:00:47.484991 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0503634e_4b4f_462e_970b_8c0dad86309f.slice/crio-1aaf0fc42a6a27b0e0af6e51081431b707a212d30acb80a0605e974dacc4b31a WatchSource:0}: Error finding container 1aaf0fc42a6a27b0e0af6e51081431b707a212d30acb80a0605e974dacc4b31a: Status 404 returned error can't find the container with id 1aaf0fc42a6a27b0e0af6e51081431b707a212d30acb80a0605e974dacc4b31a
Jan 31 10:00:47 crc kubenswrapper[4992]: I0131 10:00:47.801052 4992 generic.go:334] "Generic (PLEG): container finished" podID="0503634e-4b4f-462e-970b-8c0dad86309f" containerID="8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5" exitCode=0
Jan 31 10:00:47 crc kubenswrapper[4992]: I0131 10:00:47.801095 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jml68" event={"ID":"0503634e-4b4f-462e-970b-8c0dad86309f","Type":"ContainerDied","Data":"8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5"}
Jan 31 10:00:47 crc kubenswrapper[4992]: I0131 10:00:47.801123 4992
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jml68" event={"ID":"0503634e-4b4f-462e-970b-8c0dad86309f","Type":"ContainerStarted","Data":"1aaf0fc42a6a27b0e0af6e51081431b707a212d30acb80a0605e974dacc4b31a"} Jan 31 10:00:48 crc kubenswrapper[4992]: I0131 10:00:48.811785 4992 generic.go:334] "Generic (PLEG): container finished" podID="0503634e-4b4f-462e-970b-8c0dad86309f" containerID="08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947" exitCode=0 Jan 31 10:00:48 crc kubenswrapper[4992]: I0131 10:00:48.811883 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jml68" event={"ID":"0503634e-4b4f-462e-970b-8c0dad86309f","Type":"ContainerDied","Data":"08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947"} Jan 31 10:00:49 crc kubenswrapper[4992]: I0131 10:00:49.822989 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jml68" event={"ID":"0503634e-4b4f-462e-970b-8c0dad86309f","Type":"ContainerStarted","Data":"5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b"} Jan 31 10:00:49 crc kubenswrapper[4992]: I0131 10:00:49.848009 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jml68" podStartSLOduration=2.450935297 podStartE2EDuration="3.847992202s" podCreationTimestamp="2026-01-31 10:00:46 +0000 UTC" firstStartedPulling="2026-01-31 10:00:47.804739927 +0000 UTC m=+2143.776131914" lastFinishedPulling="2026-01-31 10:00:49.201796842 +0000 UTC m=+2145.173188819" observedRunningTime="2026-01-31 10:00:49.840225429 +0000 UTC m=+2145.811617426" watchObservedRunningTime="2026-01-31 10:00:49.847992202 +0000 UTC m=+2145.819384189" Jan 31 10:00:56 crc kubenswrapper[4992]: I0131 10:00:56.986466 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jml68" Jan 31 10:00:56 crc 
kubenswrapper[4992]: I0131 10:00:56.987048 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jml68" Jan 31 10:00:57 crc kubenswrapper[4992]: I0131 10:00:57.030535 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jml68" Jan 31 10:00:57 crc kubenswrapper[4992]: I0131 10:00:57.923296 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jml68" Jan 31 10:00:57 crc kubenswrapper[4992]: I0131 10:00:57.969166 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jml68"] Jan 31 10:00:59 crc kubenswrapper[4992]: I0131 10:00:59.899542 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jml68" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="registry-server" containerID="cri-o://5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b" gracePeriod=2 Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.158927 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29497561-fzlr9"] Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.160389 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.180314 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29497561-fzlr9"] Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.257287 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-fernet-keys\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.257386 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-config-data\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.257854 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-combined-ca-bundle\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.257968 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422qq\" (UniqueName: \"kubernetes.io/projected/d5371aa1-88ea-42d2-9630-731d03707bb7-kube-api-access-422qq\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.359742 4992 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-combined-ca-bundle\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.359815 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422qq\" (UniqueName: \"kubernetes.io/projected/d5371aa1-88ea-42d2-9630-731d03707bb7-kube-api-access-422qq\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.359869 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-fernet-keys\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.359934 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-config-data\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.372321 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-combined-ca-bundle\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.389712 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-fernet-keys\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.389932 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-config-data\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.416244 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422qq\" (UniqueName: \"kubernetes.io/projected/d5371aa1-88ea-42d2-9630-731d03707bb7-kube-api-access-422qq\") pod \"keystone-cron-29497561-fzlr9\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.500227 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jml68" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.507777 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.670410 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-utilities\") pod \"0503634e-4b4f-462e-970b-8c0dad86309f\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.670772 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfnx\" (UniqueName: \"kubernetes.io/projected/0503634e-4b4f-462e-970b-8c0dad86309f-kube-api-access-ddfnx\") pod \"0503634e-4b4f-462e-970b-8c0dad86309f\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.670795 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-catalog-content\") pod \"0503634e-4b4f-462e-970b-8c0dad86309f\" (UID: \"0503634e-4b4f-462e-970b-8c0dad86309f\") " Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.671911 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-utilities" (OuterVolumeSpecName: "utilities") pod "0503634e-4b4f-462e-970b-8c0dad86309f" (UID: "0503634e-4b4f-462e-970b-8c0dad86309f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.675956 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0503634e-4b4f-462e-970b-8c0dad86309f-kube-api-access-ddfnx" (OuterVolumeSpecName: "kube-api-access-ddfnx") pod "0503634e-4b4f-462e-970b-8c0dad86309f" (UID: "0503634e-4b4f-462e-970b-8c0dad86309f"). InnerVolumeSpecName "kube-api-access-ddfnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.698569 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0503634e-4b4f-462e-970b-8c0dad86309f" (UID: "0503634e-4b4f-462e-970b-8c0dad86309f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.773004 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.773033 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddfnx\" (UniqueName: \"kubernetes.io/projected/0503634e-4b4f-462e-970b-8c0dad86309f-kube-api-access-ddfnx\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.773043 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0503634e-4b4f-462e-970b-8c0dad86309f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.909059 4992 generic.go:334] "Generic (PLEG): container finished" podID="0503634e-4b4f-462e-970b-8c0dad86309f" containerID="5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b" exitCode=0 Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.909112 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jml68" event={"ID":"0503634e-4b4f-462e-970b-8c0dad86309f","Type":"ContainerDied","Data":"5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b"} Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.909149 4992 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jml68" event={"ID":"0503634e-4b4f-462e-970b-8c0dad86309f","Type":"ContainerDied","Data":"1aaf0fc42a6a27b0e0af6e51081431b707a212d30acb80a0605e974dacc4b31a"} Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.909153 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jml68" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.909169 4992 scope.go:117] "RemoveContainer" containerID="5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.928903 4992 scope.go:117] "RemoveContainer" containerID="08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.947381 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jml68"] Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.955461 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jml68"] Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.963650 4992 scope.go:117] "RemoveContainer" containerID="8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.987725 4992 scope.go:117] "RemoveContainer" containerID="5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b" Jan 31 10:01:00 crc kubenswrapper[4992]: E0131 10:01:00.988215 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b\": container with ID starting with 5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b not found: ID does not exist" containerID="5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.988525 4992 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b"} err="failed to get container status \"5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b\": rpc error: code = NotFound desc = could not find container \"5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b\": container with ID starting with 5910f874770f60f1d7b90341f52d3494121f25b71397f7199a64c44177f1315b not found: ID does not exist" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.988560 4992 scope.go:117] "RemoveContainer" containerID="08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947" Jan 31 10:01:00 crc kubenswrapper[4992]: E0131 10:01:00.988835 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947\": container with ID starting with 08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947 not found: ID does not exist" containerID="08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.988861 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947"} err="failed to get container status \"08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947\": rpc error: code = NotFound desc = could not find container \"08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947\": container with ID starting with 08684821e7a59ea62fdf566d056d53a7d37a8b74b2ba2c263821321994633947 not found: ID does not exist" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.988876 4992 scope.go:117] "RemoveContainer" containerID="8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5" Jan 31 10:01:00 crc kubenswrapper[4992]: E0131 
10:01:00.989125 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5\": container with ID starting with 8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5 not found: ID does not exist" containerID="8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5" Jan 31 10:01:00 crc kubenswrapper[4992]: I0131 10:01:00.989151 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5"} err="failed to get container status \"8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5\": rpc error: code = NotFound desc = could not find container \"8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5\": container with ID starting with 8c90bda36a45923385a00eefed9242d6c43dc97a7b3815e486cd87212c7f92e5 not found: ID does not exist" Jan 31 10:01:01 crc kubenswrapper[4992]: I0131 10:01:01.008949 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29497561-fzlr9"] Jan 31 10:01:01 crc kubenswrapper[4992]: I0131 10:01:01.191720 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" path="/var/lib/kubelet/pods/0503634e-4b4f-462e-970b-8c0dad86309f/volumes" Jan 31 10:01:01 crc kubenswrapper[4992]: I0131 10:01:01.919043 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-fzlr9" event={"ID":"d5371aa1-88ea-42d2-9630-731d03707bb7","Type":"ContainerStarted","Data":"72e905218f7754c35034e46727a57149ce0ffcc03fd4a6e01263024bfe4db4a3"} Jan 31 10:01:01 crc kubenswrapper[4992]: I0131 10:01:01.919401 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-fzlr9" 
event={"ID":"d5371aa1-88ea-42d2-9630-731d03707bb7","Type":"ContainerStarted","Data":"468dc52e1627f7cd8046a5fc00570bd6687f55088aa5de30204af5b7eb1515a7"} Jan 31 10:01:01 crc kubenswrapper[4992]: I0131 10:01:01.938950 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29497561-fzlr9" podStartSLOduration=1.938932455 podStartE2EDuration="1.938932455s" podCreationTimestamp="2026-01-31 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:01:01.938019208 +0000 UTC m=+2157.909411245" watchObservedRunningTime="2026-01-31 10:01:01.938932455 +0000 UTC m=+2157.910324442" Jan 31 10:01:03 crc kubenswrapper[4992]: I0131 10:01:03.947768 4992 generic.go:334] "Generic (PLEG): container finished" podID="d5371aa1-88ea-42d2-9630-731d03707bb7" containerID="72e905218f7754c35034e46727a57149ce0ffcc03fd4a6e01263024bfe4db4a3" exitCode=0 Jan 31 10:01:03 crc kubenswrapper[4992]: I0131 10:01:03.947978 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-fzlr9" event={"ID":"d5371aa1-88ea-42d2-9630-731d03707bb7","Type":"ContainerDied","Data":"72e905218f7754c35034e46727a57149ce0ffcc03fd4a6e01263024bfe4db4a3"} Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.291085 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.453746 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-combined-ca-bundle\") pod \"d5371aa1-88ea-42d2-9630-731d03707bb7\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.455517 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-config-data\") pod \"d5371aa1-88ea-42d2-9630-731d03707bb7\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.455664 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-422qq\" (UniqueName: \"kubernetes.io/projected/d5371aa1-88ea-42d2-9630-731d03707bb7-kube-api-access-422qq\") pod \"d5371aa1-88ea-42d2-9630-731d03707bb7\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.455747 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-fernet-keys\") pod \"d5371aa1-88ea-42d2-9630-731d03707bb7\" (UID: \"d5371aa1-88ea-42d2-9630-731d03707bb7\") " Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.461299 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d5371aa1-88ea-42d2-9630-731d03707bb7" (UID: "d5371aa1-88ea-42d2-9630-731d03707bb7"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.463303 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5371aa1-88ea-42d2-9630-731d03707bb7-kube-api-access-422qq" (OuterVolumeSpecName: "kube-api-access-422qq") pod "d5371aa1-88ea-42d2-9630-731d03707bb7" (UID: "d5371aa1-88ea-42d2-9630-731d03707bb7"). InnerVolumeSpecName "kube-api-access-422qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.487543 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5371aa1-88ea-42d2-9630-731d03707bb7" (UID: "d5371aa1-88ea-42d2-9630-731d03707bb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.502122 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-config-data" (OuterVolumeSpecName: "config-data") pod "d5371aa1-88ea-42d2-9630-731d03707bb7" (UID: "d5371aa1-88ea-42d2-9630-731d03707bb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.559824 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-422qq\" (UniqueName: \"kubernetes.io/projected/d5371aa1-88ea-42d2-9630-731d03707bb7-kube-api-access-422qq\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.560466 4992 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.560627 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.560778 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5371aa1-88ea-42d2-9630-731d03707bb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.962332 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29497561-fzlr9" event={"ID":"d5371aa1-88ea-42d2-9630-731d03707bb7","Type":"ContainerDied","Data":"468dc52e1627f7cd8046a5fc00570bd6687f55088aa5de30204af5b7eb1515a7"} Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.962371 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468dc52e1627f7cd8046a5fc00570bd6687f55088aa5de30204af5b7eb1515a7" Jan 31 10:01:05 crc kubenswrapper[4992]: I0131 10:01:05.962374 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29497561-fzlr9" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.236833 4992 scope.go:117] "RemoveContainer" containerID="5b476fc156520a50e29c812a048028fef35bc25e96080ff83975e220f1aca646" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.288913 4992 scope.go:117] "RemoveContainer" containerID="cad65664830e188a35e69070e0221ac64d8d22028229b989a10089f47b3d9b86" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.356867 4992 scope.go:117] "RemoveContainer" containerID="514a43dad636cc56e6082511525f06b2252abcac2c2788ab6cfb237aeb178c58" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.406566 4992 scope.go:117] "RemoveContainer" containerID="c9b93c36e46afdf9050b50c139c7f1ca101ea6375add174e8e8a2e8ab272a583" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.435519 4992 scope.go:117] "RemoveContainer" containerID="d81f5b770b5a441d90c574171ef6df6ba7de1a9b00d5aeb7e305ed5f73af49cb" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.481676 4992 scope.go:117] "RemoveContainer" containerID="cf3d1de6c588d3f54134231504b6eac26a013144b733bc5f1bee9ac6d84c4a80" Jan 31 10:01:18 crc kubenswrapper[4992]: I0131 10:01:18.525354 4992 scope.go:117] "RemoveContainer" containerID="4008cb617e1336397c426ef5b438104a645b45b525edeb75c7e58f2345165c77" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.906515 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vfm8w"] Jan 31 10:02:04 crc kubenswrapper[4992]: E0131 10:02:04.907608 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="registry-server" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.907631 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="registry-server" Jan 31 10:02:04 crc kubenswrapper[4992]: E0131 10:02:04.907649 4992 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="extract-utilities" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.907658 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="extract-utilities" Jan 31 10:02:04 crc kubenswrapper[4992]: E0131 10:02:04.907690 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="extract-content" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.907699 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="extract-content" Jan 31 10:02:04 crc kubenswrapper[4992]: E0131 10:02:04.907716 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5371aa1-88ea-42d2-9630-731d03707bb7" containerName="keystone-cron" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.907724 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5371aa1-88ea-42d2-9630-731d03707bb7" containerName="keystone-cron" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.907977 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5371aa1-88ea-42d2-9630-731d03707bb7" containerName="keystone-cron" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.907995 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="0503634e-4b4f-462e-970b-8c0dad86309f" containerName="registry-server" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.909706 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:04 crc kubenswrapper[4992]: I0131 10:02:04.917106 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfm8w"] Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.046396 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-utilities\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.046510 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-catalog-content\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.046539 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwxh\" (UniqueName: \"kubernetes.io/projected/1a9874a4-2c06-4b33-aac8-2a3d893567ef-kube-api-access-vcwxh\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.148912 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-utilities\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.149032 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-catalog-content\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.149070 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwxh\" (UniqueName: \"kubernetes.io/projected/1a9874a4-2c06-4b33-aac8-2a3d893567ef-kube-api-access-vcwxh\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.149375 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-utilities\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.149657 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-catalog-content\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.189728 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwxh\" (UniqueName: \"kubernetes.io/projected/1a9874a4-2c06-4b33-aac8-2a3d893567ef-kube-api-access-vcwxh\") pod \"certified-operators-vfm8w\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.236332 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:05 crc kubenswrapper[4992]: I0131 10:02:05.766954 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfm8w"] Jan 31 10:02:06 crc kubenswrapper[4992]: I0131 10:02:06.481547 4992 generic.go:334] "Generic (PLEG): container finished" podID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerID="30e7a423c42a491f9e538b622f88b8a7a0686fe18ccb98bce91e6d6b0bd35a23" exitCode=0 Jan 31 10:02:06 crc kubenswrapper[4992]: I0131 10:02:06.481776 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerDied","Data":"30e7a423c42a491f9e538b622f88b8a7a0686fe18ccb98bce91e6d6b0bd35a23"} Jan 31 10:02:06 crc kubenswrapper[4992]: I0131 10:02:06.481801 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerStarted","Data":"7f509bc3201eaa7a2cecf3d92b53dd086188ab948c885b9f2aab9348972f59fb"} Jan 31 10:02:07 crc kubenswrapper[4992]: I0131 10:02:07.495249 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerStarted","Data":"23eddd70d649bbd9088bbbe76287f4f6fb5151e8e754c02dc05b2e297b083a1b"} Jan 31 10:02:08 crc kubenswrapper[4992]: I0131 10:02:08.506604 4992 generic.go:334] "Generic (PLEG): container finished" podID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerID="23eddd70d649bbd9088bbbe76287f4f6fb5151e8e754c02dc05b2e297b083a1b" exitCode=0 Jan 31 10:02:08 crc kubenswrapper[4992]: I0131 10:02:08.506661 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" 
event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerDied","Data":"23eddd70d649bbd9088bbbe76287f4f6fb5151e8e754c02dc05b2e297b083a1b"} Jan 31 10:02:09 crc kubenswrapper[4992]: I0131 10:02:09.516777 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerStarted","Data":"0128ad262675ac22e20b45ca7fe93eef430e9cf5b55882ee00ce7bcfc84fef09"} Jan 31 10:02:09 crc kubenswrapper[4992]: I0131 10:02:09.532305 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vfm8w" podStartSLOduration=3.101880122 podStartE2EDuration="5.532287207s" podCreationTimestamp="2026-01-31 10:02:04 +0000 UTC" firstStartedPulling="2026-01-31 10:02:06.487861806 +0000 UTC m=+2222.459253813" lastFinishedPulling="2026-01-31 10:02:08.918268871 +0000 UTC m=+2224.889660898" observedRunningTime="2026-01-31 10:02:09.531122913 +0000 UTC m=+2225.502514900" watchObservedRunningTime="2026-01-31 10:02:09.532287207 +0000 UTC m=+2225.503679194" Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.237206 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.238366 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.294503 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.300749 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.300797 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.621868 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:15 crc kubenswrapper[4992]: I0131 10:02:15.675076 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfm8w"] Jan 31 10:02:17 crc kubenswrapper[4992]: I0131 10:02:17.586563 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vfm8w" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="registry-server" containerID="cri-o://0128ad262675ac22e20b45ca7fe93eef430e9cf5b55882ee00ce7bcfc84fef09" gracePeriod=2 Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.601599 4992 generic.go:334] "Generic (PLEG): container finished" podID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerID="0128ad262675ac22e20b45ca7fe93eef430e9cf5b55882ee00ce7bcfc84fef09" exitCode=0 Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.601707 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerDied","Data":"0128ad262675ac22e20b45ca7fe93eef430e9cf5b55882ee00ce7bcfc84fef09"} Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.603808 4992 generic.go:334] "Generic (PLEG): container finished" podID="e9d6e849-d0a1-4943-b626-7c38e8ac6a11" containerID="f7d4e701aa4fafc3565a54b890b5298c403e92f93e007cd00855657084921acf" exitCode=0 Jan 
31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.603842 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" event={"ID":"e9d6e849-d0a1-4943-b626-7c38e8ac6a11","Type":"ContainerDied","Data":"f7d4e701aa4fafc3565a54b890b5298c403e92f93e007cd00855657084921acf"} Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.722387 4992 scope.go:117] "RemoveContainer" containerID="ca35c71fd7ee98003c37eabae76a9ad3ba72586edc1c63d97114b17124ffd278" Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.771309 4992 scope.go:117] "RemoveContainer" containerID="bf2e5113db504f0f21f9e3c073b56506fb4f82095aad338c3460e57381cb9b5d" Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.796019 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.824473 4992 scope.go:117] "RemoveContainer" containerID="a859d4ceb7fb070808aa6e6b52860fd54f86e2aa08d4142b351d95252242352c" Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.911138 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-utilities\") pod \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.911305 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwxh\" (UniqueName: \"kubernetes.io/projected/1a9874a4-2c06-4b33-aac8-2a3d893567ef-kube-api-access-vcwxh\") pod \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.911356 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-catalog-content\") pod \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\" (UID: \"1a9874a4-2c06-4b33-aac8-2a3d893567ef\") " Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.912707 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-utilities" (OuterVolumeSpecName: "utilities") pod "1a9874a4-2c06-4b33-aac8-2a3d893567ef" (UID: "1a9874a4-2c06-4b33-aac8-2a3d893567ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.918071 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9874a4-2c06-4b33-aac8-2a3d893567ef-kube-api-access-vcwxh" (OuterVolumeSpecName: "kube-api-access-vcwxh") pod "1a9874a4-2c06-4b33-aac8-2a3d893567ef" (UID: "1a9874a4-2c06-4b33-aac8-2a3d893567ef"). InnerVolumeSpecName "kube-api-access-vcwxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:02:18 crc kubenswrapper[4992]: I0131 10:02:18.972390 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a9874a4-2c06-4b33-aac8-2a3d893567ef" (UID: "1a9874a4-2c06-4b33-aac8-2a3d893567ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.013686 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.013729 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwxh\" (UniqueName: \"kubernetes.io/projected/1a9874a4-2c06-4b33-aac8-2a3d893567ef-kube-api-access-vcwxh\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.013745 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a9874a4-2c06-4b33-aac8-2a3d893567ef-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.616495 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfm8w" event={"ID":"1a9874a4-2c06-4b33-aac8-2a3d893567ef","Type":"ContainerDied","Data":"7f509bc3201eaa7a2cecf3d92b53dd086188ab948c885b9f2aab9348972f59fb"} Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.616545 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vfm8w" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.616593 4992 scope.go:117] "RemoveContainer" containerID="0128ad262675ac22e20b45ca7fe93eef430e9cf5b55882ee00ce7bcfc84fef09" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.649516 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfm8w"] Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.655958 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vfm8w"] Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.664269 4992 scope.go:117] "RemoveContainer" containerID="23eddd70d649bbd9088bbbe76287f4f6fb5151e8e754c02dc05b2e297b083a1b" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.689877 4992 scope.go:117] "RemoveContainer" containerID="30e7a423c42a491f9e538b622f88b8a7a0686fe18ccb98bce91e6d6b0bd35a23" Jan 31 10:02:19 crc kubenswrapper[4992]: I0131 10:02:19.994718 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.134751 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-inventory\") pod \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.135022 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ceph\") pod \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.135110 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ssh-key-openstack-edpm-ipam\") pod \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.135149 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxbz2\" (UniqueName: \"kubernetes.io/projected/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-kube-api-access-gxbz2\") pod \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.135202 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-bootstrap-combined-ca-bundle\") pod \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\" (UID: \"e9d6e849-d0a1-4943-b626-7c38e8ac6a11\") " Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.140363 4992 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ceph" (OuterVolumeSpecName: "ceph") pod "e9d6e849-d0a1-4943-b626-7c38e8ac6a11" (UID: "e9d6e849-d0a1-4943-b626-7c38e8ac6a11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.140743 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e9d6e849-d0a1-4943-b626-7c38e8ac6a11" (UID: "e9d6e849-d0a1-4943-b626-7c38e8ac6a11"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.141551 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-kube-api-access-gxbz2" (OuterVolumeSpecName: "kube-api-access-gxbz2") pod "e9d6e849-d0a1-4943-b626-7c38e8ac6a11" (UID: "e9d6e849-d0a1-4943-b626-7c38e8ac6a11"). InnerVolumeSpecName "kube-api-access-gxbz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.159592 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9d6e849-d0a1-4943-b626-7c38e8ac6a11" (UID: "e9d6e849-d0a1-4943-b626-7c38e8ac6a11"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.162608 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-inventory" (OuterVolumeSpecName: "inventory") pod "e9d6e849-d0a1-4943-b626-7c38e8ac6a11" (UID: "e9d6e849-d0a1-4943-b626-7c38e8ac6a11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.237577 4992 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.237737 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.237777 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.238106 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.238169 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxbz2\" (UniqueName: \"kubernetes.io/projected/e9d6e849-d0a1-4943-b626-7c38e8ac6a11-kube-api-access-gxbz2\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.624436 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.624452 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p" event={"ID":"e9d6e849-d0a1-4943-b626-7c38e8ac6a11","Type":"ContainerDied","Data":"14c9d51ac2fdbdf5428bc744927013fc8ea4473e2d14095f69708b8a340e47b7"} Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.624492 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14c9d51ac2fdbdf5428bc744927013fc8ea4473e2d14095f69708b8a340e47b7" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.719826 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw"] Jan 31 10:02:20 crc kubenswrapper[4992]: E0131 10:02:20.720267 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="registry-server" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.720289 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="registry-server" Jan 31 10:02:20 crc kubenswrapper[4992]: E0131 10:02:20.720313 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="extract-content" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.720321 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="extract-content" Jan 31 10:02:20 crc kubenswrapper[4992]: E0131 10:02:20.720360 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d6e849-d0a1-4943-b626-7c38e8ac6a11" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.720370 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e9d6e849-d0a1-4943-b626-7c38e8ac6a11" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:20 crc kubenswrapper[4992]: E0131 10:02:20.720385 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="extract-utilities" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.720392 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="extract-utilities" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.720605 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" containerName="registry-server" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.720641 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d6e849-d0a1-4943-b626-7c38e8ac6a11" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.721360 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.725661 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.725905 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.725998 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.725926 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.726302 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.737800 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw"] Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.853739 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.854088 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" 
(UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.854256 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.854288 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jwh\" (UniqueName: \"kubernetes.io/projected/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-kube-api-access-j7jwh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.956442 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.956514 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc 
kubenswrapper[4992]: I0131 10:02:20.956688 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.956732 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jwh\" (UniqueName: \"kubernetes.io/projected/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-kube-api-access-j7jwh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.961522 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.963266 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.968023 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:20 crc kubenswrapper[4992]: I0131 10:02:20.987700 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jwh\" (UniqueName: \"kubernetes.io/projected/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-kube-api-access-j7jwh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-547tw\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:21 crc kubenswrapper[4992]: I0131 10:02:21.042675 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:21 crc kubenswrapper[4992]: I0131 10:02:21.200797 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9874a4-2c06-4b33-aac8-2a3d893567ef" path="/var/lib/kubelet/pods/1a9874a4-2c06-4b33-aac8-2a3d893567ef/volumes" Jan 31 10:02:21 crc kubenswrapper[4992]: I0131 10:02:21.548477 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw"] Jan 31 10:02:21 crc kubenswrapper[4992]: W0131 10:02:21.557688 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ba2e7f8_2510_48bd_9597_8f25bde3aed5.slice/crio-0128e30fd018c8aaf2f87feffb50975c8804eba078610e9ae32c66027a5bd041 WatchSource:0}: Error finding container 0128e30fd018c8aaf2f87feffb50975c8804eba078610e9ae32c66027a5bd041: Status 404 returned error can't find the container with id 0128e30fd018c8aaf2f87feffb50975c8804eba078610e9ae32c66027a5bd041 Jan 31 10:02:21 crc kubenswrapper[4992]: I0131 10:02:21.633341 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" event={"ID":"8ba2e7f8-2510-48bd-9597-8f25bde3aed5","Type":"ContainerStarted","Data":"0128e30fd018c8aaf2f87feffb50975c8804eba078610e9ae32c66027a5bd041"} Jan 31 10:02:22 crc kubenswrapper[4992]: I0131 10:02:22.644579 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" event={"ID":"8ba2e7f8-2510-48bd-9597-8f25bde3aed5","Type":"ContainerStarted","Data":"6f46e309b15028b1a4a8e2786fc26b82f56a2145b2a79de298e5bb6d83af2d04"} Jan 31 10:02:22 crc kubenswrapper[4992]: I0131 10:02:22.671211 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" podStartSLOduration=2.213182353 podStartE2EDuration="2.671178477s" podCreationTimestamp="2026-01-31 10:02:20 +0000 UTC" firstStartedPulling="2026-01-31 10:02:21.560005513 +0000 UTC m=+2237.531397500" lastFinishedPulling="2026-01-31 10:02:22.018001597 +0000 UTC m=+2237.989393624" observedRunningTime="2026-01-31 10:02:22.669059466 +0000 UTC m=+2238.640451493" watchObservedRunningTime="2026-01-31 10:02:22.671178477 +0000 UTC m=+2238.642570494" Jan 31 10:02:44 crc kubenswrapper[4992]: I0131 10:02:44.843869 4992 generic.go:334] "Generic (PLEG): container finished" podID="8ba2e7f8-2510-48bd-9597-8f25bde3aed5" containerID="6f46e309b15028b1a4a8e2786fc26b82f56a2145b2a79de298e5bb6d83af2d04" exitCode=0 Jan 31 10:02:44 crc kubenswrapper[4992]: I0131 10:02:44.843946 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" event={"ID":"8ba2e7f8-2510-48bd-9597-8f25bde3aed5","Type":"ContainerDied","Data":"6f46e309b15028b1a4a8e2786fc26b82f56a2145b2a79de298e5bb6d83af2d04"} Jan 31 10:02:45 crc kubenswrapper[4992]: I0131 10:02:45.301312 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:02:45 crc kubenswrapper[4992]: I0131 10:02:45.301636 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.264637 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.456405 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jwh\" (UniqueName: \"kubernetes.io/projected/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-kube-api-access-j7jwh\") pod \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.456473 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-inventory\") pod \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.456503 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ssh-key-openstack-edpm-ipam\") pod \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.456578 4992 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ceph\") pod \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\" (UID: \"8ba2e7f8-2510-48bd-9597-8f25bde3aed5\") " Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.462674 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-kube-api-access-j7jwh" (OuterVolumeSpecName: "kube-api-access-j7jwh") pod "8ba2e7f8-2510-48bd-9597-8f25bde3aed5" (UID: "8ba2e7f8-2510-48bd-9597-8f25bde3aed5"). InnerVolumeSpecName "kube-api-access-j7jwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.465754 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ceph" (OuterVolumeSpecName: "ceph") pod "8ba2e7f8-2510-48bd-9597-8f25bde3aed5" (UID: "8ba2e7f8-2510-48bd-9597-8f25bde3aed5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.482307 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-inventory" (OuterVolumeSpecName: "inventory") pod "8ba2e7f8-2510-48bd-9597-8f25bde3aed5" (UID: "8ba2e7f8-2510-48bd-9597-8f25bde3aed5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.484712 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8ba2e7f8-2510-48bd-9597-8f25bde3aed5" (UID: "8ba2e7f8-2510-48bd-9597-8f25bde3aed5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.558494 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jwh\" (UniqueName: \"kubernetes.io/projected/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-kube-api-access-j7jwh\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.558526 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.558540 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.558552 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8ba2e7f8-2510-48bd-9597-8f25bde3aed5-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.873773 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" event={"ID":"8ba2e7f8-2510-48bd-9597-8f25bde3aed5","Type":"ContainerDied","Data":"0128e30fd018c8aaf2f87feffb50975c8804eba078610e9ae32c66027a5bd041"} Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.874087 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0128e30fd018c8aaf2f87feffb50975c8804eba078610e9ae32c66027a5bd041" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.873856 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-547tw" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.965228 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r"] Jan 31 10:02:46 crc kubenswrapper[4992]: E0131 10:02:46.965725 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba2e7f8-2510-48bd-9597-8f25bde3aed5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.965748 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba2e7f8-2510-48bd-9597-8f25bde3aed5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.965935 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba2e7f8-2510-48bd-9597-8f25bde3aed5" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.966685 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.968876 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.969168 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.969308 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.969441 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.971104 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:02:46 crc kubenswrapper[4992]: I0131 10:02:46.997006 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r"] Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.169554 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.169657 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.169776 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7frh\" (UniqueName: \"kubernetes.io/projected/6b39a385-5883-4055-99cf-3c82edc683d6-kube-api-access-x7frh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.169922 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.272015 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.272186 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7frh\" (UniqueName: \"kubernetes.io/projected/6b39a385-5883-4055-99cf-3c82edc683d6-kube-api-access-x7frh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 
10:02:47.272391 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.272979 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.279749 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.279826 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.281175 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.296702 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7frh\" (UniqueName: \"kubernetes.io/projected/6b39a385-5883-4055-99cf-3c82edc683d6-kube-api-access-x7frh\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dh72r\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.587663 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:47 crc kubenswrapper[4992]: I0131 10:02:47.899530 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r"] Jan 31 10:02:47 crc kubenswrapper[4992]: W0131 10:02:47.902232 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b39a385_5883_4055_99cf_3c82edc683d6.slice/crio-17866b878fea0ac871d6b70f9a951876405e24d4cb81d936201a52102a78e6ae WatchSource:0}: Error finding container 17866b878fea0ac871d6b70f9a951876405e24d4cb81d936201a52102a78e6ae: Status 404 returned error can't find the container with id 17866b878fea0ac871d6b70f9a951876405e24d4cb81d936201a52102a78e6ae Jan 31 10:02:48 crc kubenswrapper[4992]: I0131 10:02:48.894026 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" event={"ID":"6b39a385-5883-4055-99cf-3c82edc683d6","Type":"ContainerStarted","Data":"a7d79cde95997c3d1e85cccfa590db1bf1194e595141f9ea1f5628ca1b017aa6"} Jan 31 10:02:48 crc kubenswrapper[4992]: I0131 10:02:48.894350 
4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" event={"ID":"6b39a385-5883-4055-99cf-3c82edc683d6","Type":"ContainerStarted","Data":"17866b878fea0ac871d6b70f9a951876405e24d4cb81d936201a52102a78e6ae"} Jan 31 10:02:48 crc kubenswrapper[4992]: I0131 10:02:48.928209 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" podStartSLOduration=2.49288213 podStartE2EDuration="2.928181182s" podCreationTimestamp="2026-01-31 10:02:46 +0000 UTC" firstStartedPulling="2026-01-31 10:02:47.904285204 +0000 UTC m=+2263.875677191" lastFinishedPulling="2026-01-31 10:02:48.339584216 +0000 UTC m=+2264.310976243" observedRunningTime="2026-01-31 10:02:48.916203508 +0000 UTC m=+2264.887595515" watchObservedRunningTime="2026-01-31 10:02:48.928181182 +0000 UTC m=+2264.899573209" Jan 31 10:02:53 crc kubenswrapper[4992]: I0131 10:02:53.937470 4992 generic.go:334] "Generic (PLEG): container finished" podID="6b39a385-5883-4055-99cf-3c82edc683d6" containerID="a7d79cde95997c3d1e85cccfa590db1bf1194e595141f9ea1f5628ca1b017aa6" exitCode=0 Jan 31 10:02:53 crc kubenswrapper[4992]: I0131 10:02:53.938136 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" event={"ID":"6b39a385-5883-4055-99cf-3c82edc683d6","Type":"ContainerDied","Data":"a7d79cde95997c3d1e85cccfa590db1bf1194e595141f9ea1f5628ca1b017aa6"} Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.324758 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.435607 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-inventory\") pod \"6b39a385-5883-4055-99cf-3c82edc683d6\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.435814 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7frh\" (UniqueName: \"kubernetes.io/projected/6b39a385-5883-4055-99cf-3c82edc683d6-kube-api-access-x7frh\") pod \"6b39a385-5883-4055-99cf-3c82edc683d6\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.435969 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ceph\") pod \"6b39a385-5883-4055-99cf-3c82edc683d6\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.436026 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ssh-key-openstack-edpm-ipam\") pod \"6b39a385-5883-4055-99cf-3c82edc683d6\" (UID: \"6b39a385-5883-4055-99cf-3c82edc683d6\") " Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.442408 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b39a385-5883-4055-99cf-3c82edc683d6-kube-api-access-x7frh" (OuterVolumeSpecName: "kube-api-access-x7frh") pod "6b39a385-5883-4055-99cf-3c82edc683d6" (UID: "6b39a385-5883-4055-99cf-3c82edc683d6"). InnerVolumeSpecName "kube-api-access-x7frh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.442947 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ceph" (OuterVolumeSpecName: "ceph") pod "6b39a385-5883-4055-99cf-3c82edc683d6" (UID: "6b39a385-5883-4055-99cf-3c82edc683d6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.462003 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-inventory" (OuterVolumeSpecName: "inventory") pod "6b39a385-5883-4055-99cf-3c82edc683d6" (UID: "6b39a385-5883-4055-99cf-3c82edc683d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.539565 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7frh\" (UniqueName: \"kubernetes.io/projected/6b39a385-5883-4055-99cf-3c82edc683d6-kube-api-access-x7frh\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.539799 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.539884 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.540681 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6b39a385-5883-4055-99cf-3c82edc683d6" (UID: 
"6b39a385-5883-4055-99cf-3c82edc683d6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.642459 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b39a385-5883-4055-99cf-3c82edc683d6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.969387 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" event={"ID":"6b39a385-5883-4055-99cf-3c82edc683d6","Type":"ContainerDied","Data":"17866b878fea0ac871d6b70f9a951876405e24d4cb81d936201a52102a78e6ae"} Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.969478 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17866b878fea0ac871d6b70f9a951876405e24d4cb81d936201a52102a78e6ae" Jan 31 10:02:55 crc kubenswrapper[4992]: I0131 10:02:55.969542 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dh72r" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.055194 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt"] Jan 31 10:02:56 crc kubenswrapper[4992]: E0131 10:02:56.055654 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b39a385-5883-4055-99cf-3c82edc683d6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.055675 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b39a385-5883-4055-99cf-3c82edc683d6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.055842 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b39a385-5883-4055-99cf-3c82edc683d6" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.056429 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.060118 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.060459 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.060715 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.060951 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.061097 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.085255 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt"] Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.252346 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.252905 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94j5\" (UniqueName: \"kubernetes.io/projected/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-kube-api-access-h94j5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.252983 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.253164 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.354986 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.355045 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h94j5\" (UniqueName: \"kubernetes.io/projected/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-kube-api-access-h94j5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.355088 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.355126 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.359778 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.361533 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.368060 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.377027 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94j5\" (UniqueName: \"kubernetes.io/projected/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-kube-api-access-h94j5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-thgvt\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.382859 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.868842 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt"] Jan 31 10:02:56 crc kubenswrapper[4992]: I0131 10:02:56.977628 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" event={"ID":"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7","Type":"ContainerStarted","Data":"1280d6c2ec31ece44da4becc8f743ddbfd53c6625790f982200edfbeb1f7e997"} Jan 31 10:02:57 crc kubenswrapper[4992]: I0131 10:02:57.989120 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" event={"ID":"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7","Type":"ContainerStarted","Data":"b756a2f5f727f2ac8129d9fd7e9bee7bbfc5ead77a8d5daa70ff2d8daabc2c3b"} Jan 31 10:02:58 crc kubenswrapper[4992]: I0131 10:02:58.021917 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" podStartSLOduration=1.603240874 podStartE2EDuration="2.021896008s" podCreationTimestamp="2026-01-31 10:02:56 +0000 UTC" firstStartedPulling="2026-01-31 10:02:56.870389926 +0000 UTC m=+2272.841781913" 
lastFinishedPulling="2026-01-31 10:02:57.28904506 +0000 UTC m=+2273.260437047" observedRunningTime="2026-01-31 10:02:58.013292992 +0000 UTC m=+2273.984684999" watchObservedRunningTime="2026-01-31 10:02:58.021896008 +0000 UTC m=+2273.993288005" Jan 31 10:03:15 crc kubenswrapper[4992]: I0131 10:03:15.301625 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:03:15 crc kubenswrapper[4992]: I0131 10:03:15.302607 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:03:15 crc kubenswrapper[4992]: I0131 10:03:15.302697 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:03:15 crc kubenswrapper[4992]: I0131 10:03:15.303802 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:03:15 crc kubenswrapper[4992]: I0131 10:03:15.303875 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" 
containerID="cri-o://1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" gracePeriod=600 Jan 31 10:03:15 crc kubenswrapper[4992]: E0131 10:03:15.440988 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:03:16 crc kubenswrapper[4992]: I0131 10:03:16.191640 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" exitCode=0 Jan 31 10:03:16 crc kubenswrapper[4992]: I0131 10:03:16.191738 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81"} Jan 31 10:03:16 crc kubenswrapper[4992]: I0131 10:03:16.192145 4992 scope.go:117] "RemoveContainer" containerID="5eeeb29fa9dd8d5c28d0e393025561263faa39377b4ed1db1e9b4f1fd66917c5" Jan 31 10:03:16 crc kubenswrapper[4992]: I0131 10:03:16.192869 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:03:16 crc kubenswrapper[4992]: E0131 10:03:16.193348 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:03:29 crc kubenswrapper[4992]: I0131 10:03:29.314759 4992 generic.go:334] "Generic (PLEG): container finished" podID="1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" containerID="b756a2f5f727f2ac8129d9fd7e9bee7bbfc5ead77a8d5daa70ff2d8daabc2c3b" exitCode=0 Jan 31 10:03:29 crc kubenswrapper[4992]: I0131 10:03:29.314849 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" event={"ID":"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7","Type":"ContainerDied","Data":"b756a2f5f727f2ac8129d9fd7e9bee7bbfc5ead77a8d5daa70ff2d8daabc2c3b"} Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.725641 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.826264 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ceph\") pod \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.826625 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ssh-key-openstack-edpm-ipam\") pod \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.826684 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h94j5\" (UniqueName: \"kubernetes.io/projected/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-kube-api-access-h94j5\") pod \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.826769 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-inventory\") pod \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\" (UID: \"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7\") " Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.833593 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ceph" (OuterVolumeSpecName: "ceph") pod "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" (UID: "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.833633 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-kube-api-access-h94j5" (OuterVolumeSpecName: "kube-api-access-h94j5") pod "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" (UID: "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7"). InnerVolumeSpecName "kube-api-access-h94j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.852154 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-inventory" (OuterVolumeSpecName: "inventory") pod "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" (UID: "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.870237 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" (UID: "1ad2f8cc-171c-43fc-8bf1-18d22dde00c7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.929571 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.929615 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h94j5\" (UniqueName: \"kubernetes.io/projected/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-kube-api-access-h94j5\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.929628 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:30 crc kubenswrapper[4992]: I0131 10:03:30.929640 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ad2f8cc-171c-43fc-8bf1-18d22dde00c7-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.182697 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:03:31 crc kubenswrapper[4992]: E0131 10:03:31.183102 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.343573 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" 
event={"ID":"1ad2f8cc-171c-43fc-8bf1-18d22dde00c7","Type":"ContainerDied","Data":"1280d6c2ec31ece44da4becc8f743ddbfd53c6625790f982200edfbeb1f7e997"} Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.343634 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-thgvt" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.343638 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1280d6c2ec31ece44da4becc8f743ddbfd53c6625790f982200edfbeb1f7e997" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.442253 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv"] Jan 31 10:03:31 crc kubenswrapper[4992]: E0131 10:03:31.442673 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.442697 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.442953 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad2f8cc-171c-43fc-8bf1-18d22dde00c7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.443681 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.447276 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.447604 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.447739 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.447938 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.448121 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.486600 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv"] Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.541965 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.542039 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: 
\"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.542088 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndgvb\" (UniqueName: \"kubernetes.io/projected/22c03d00-71b7-4e60-8f46-1373c8cba767-kube-api-access-ndgvb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.542281 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.643795 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.643942 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.644017 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.644064 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndgvb\" (UniqueName: \"kubernetes.io/projected/22c03d00-71b7-4e60-8f46-1373c8cba767-kube-api-access-ndgvb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.647647 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.649092 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.651154 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: 
\"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.661350 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndgvb\" (UniqueName: \"kubernetes.io/projected/22c03d00-71b7-4e60-8f46-1373c8cba767-kube-api-access-ndgvb\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:31 crc kubenswrapper[4992]: I0131 10:03:31.768956 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:32 crc kubenswrapper[4992]: I0131 10:03:32.293289 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv"] Jan 31 10:03:32 crc kubenswrapper[4992]: W0131 10:03:32.304613 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c03d00_71b7_4e60_8f46_1373c8cba767.slice/crio-e935fba4f9e3a30ed30cfa4a0841e34844e543bebb0a8c4710cdc5509a2a34e1 WatchSource:0}: Error finding container e935fba4f9e3a30ed30cfa4a0841e34844e543bebb0a8c4710cdc5509a2a34e1: Status 404 returned error can't find the container with id e935fba4f9e3a30ed30cfa4a0841e34844e543bebb0a8c4710cdc5509a2a34e1 Jan 31 10:03:32 crc kubenswrapper[4992]: I0131 10:03:32.355208 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" event={"ID":"22c03d00-71b7-4e60-8f46-1373c8cba767","Type":"ContainerStarted","Data":"e935fba4f9e3a30ed30cfa4a0841e34844e543bebb0a8c4710cdc5509a2a34e1"} Jan 31 10:03:33 crc kubenswrapper[4992]: I0131 10:03:33.368313 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" event={"ID":"22c03d00-71b7-4e60-8f46-1373c8cba767","Type":"ContainerStarted","Data":"073efd658e004b5ef060d93ef319a3271fe2dff71f6ae35c2832c986a44ddaa8"} Jan 31 10:03:36 crc kubenswrapper[4992]: I0131 10:03:36.392593 4992 generic.go:334] "Generic (PLEG): container finished" podID="22c03d00-71b7-4e60-8f46-1373c8cba767" containerID="073efd658e004b5ef060d93ef319a3271fe2dff71f6ae35c2832c986a44ddaa8" exitCode=0 Jan 31 10:03:36 crc kubenswrapper[4992]: I0131 10:03:36.392686 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" event={"ID":"22c03d00-71b7-4e60-8f46-1373c8cba767","Type":"ContainerDied","Data":"073efd658e004b5ef060d93ef319a3271fe2dff71f6ae35c2832c986a44ddaa8"} Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.829723 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.878369 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam\") pod \"22c03d00-71b7-4e60-8f46-1373c8cba767\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.878465 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndgvb\" (UniqueName: \"kubernetes.io/projected/22c03d00-71b7-4e60-8f46-1373c8cba767-kube-api-access-ndgvb\") pod \"22c03d00-71b7-4e60-8f46-1373c8cba767\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.878681 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ceph\") pod \"22c03d00-71b7-4e60-8f46-1373c8cba767\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.878772 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-inventory\") pod \"22c03d00-71b7-4e60-8f46-1373c8cba767\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.890306 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c03d00-71b7-4e60-8f46-1373c8cba767-kube-api-access-ndgvb" (OuterVolumeSpecName: "kube-api-access-ndgvb") pod "22c03d00-71b7-4e60-8f46-1373c8cba767" (UID: "22c03d00-71b7-4e60-8f46-1373c8cba767"). InnerVolumeSpecName "kube-api-access-ndgvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.890662 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ceph" (OuterVolumeSpecName: "ceph") pod "22c03d00-71b7-4e60-8f46-1373c8cba767" (UID: "22c03d00-71b7-4e60-8f46-1373c8cba767"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:03:37 crc kubenswrapper[4992]: E0131 10:03:37.907694 4992 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam podName:22c03d00-71b7-4e60-8f46-1373c8cba767 nodeName:}" failed. No retries permitted until 2026-01-31 10:03:38.407665613 +0000 UTC m=+2314.379057600 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam") pod "22c03d00-71b7-4e60-8f46-1373c8cba767" (UID: "22c03d00-71b7-4e60-8f46-1373c8cba767") : error deleting /var/lib/kubelet/pods/22c03d00-71b7-4e60-8f46-1373c8cba767/volume-subpaths: remove /var/lib/kubelet/pods/22c03d00-71b7-4e60-8f46-1373c8cba767/volume-subpaths: no such file or directory Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.910646 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-inventory" (OuterVolumeSpecName: "inventory") pod "22c03d00-71b7-4e60-8f46-1373c8cba767" (UID: "22c03d00-71b7-4e60-8f46-1373c8cba767"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.980658 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndgvb\" (UniqueName: \"kubernetes.io/projected/22c03d00-71b7-4e60-8f46-1373c8cba767-kube-api-access-ndgvb\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.980698 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:37 crc kubenswrapper[4992]: I0131 10:03:37.980715 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.407945 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" 
event={"ID":"22c03d00-71b7-4e60-8f46-1373c8cba767","Type":"ContainerDied","Data":"e935fba4f9e3a30ed30cfa4a0841e34844e543bebb0a8c4710cdc5509a2a34e1"} Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.407991 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e935fba4f9e3a30ed30cfa4a0841e34844e543bebb0a8c4710cdc5509a2a34e1" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.408017 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.486480 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam\") pod \"22c03d00-71b7-4e60-8f46-1373c8cba767\" (UID: \"22c03d00-71b7-4e60-8f46-1373c8cba767\") " Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.490367 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "22c03d00-71b7-4e60-8f46-1373c8cba767" (UID: "22c03d00-71b7-4e60-8f46-1373c8cba767"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.493261 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995"] Jan 31 10:03:38 crc kubenswrapper[4992]: E0131 10:03:38.493712 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c03d00-71b7-4e60-8f46-1373c8cba767" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.493736 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c03d00-71b7-4e60-8f46-1373c8cba767" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.493966 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c03d00-71b7-4e60-8f46-1373c8cba767" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.494943 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.502479 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995"] Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.589851 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/22c03d00-71b7-4e60-8f46-1373c8cba767-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.691776 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.691880 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lk7\" (UniqueName: \"kubernetes.io/projected/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-kube-api-access-26lk7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.691948 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.692047 
4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.793248 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.793340 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.793952 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lk7\" (UniqueName: \"kubernetes.io/projected/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-kube-api-access-26lk7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.794062 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.799209 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.799222 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.801016 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.810939 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lk7\" (UniqueName: \"kubernetes.io/projected/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-kube-api-access-26lk7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kz995\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 
10:03:38 crc kubenswrapper[4992]: I0131 10:03:38.851072 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:03:39 crc kubenswrapper[4992]: I0131 10:03:39.359229 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995"] Jan 31 10:03:39 crc kubenswrapper[4992]: I0131 10:03:39.415850 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" event={"ID":"663638e3-cdf5-47cb-9515-0c6e0ae1f11a","Type":"ContainerStarted","Data":"5f51a3e7eb5aa3307da0cad835ae959c5051a614c1c0f37d722e1aec2179bdd4"} Jan 31 10:03:40 crc kubenswrapper[4992]: I0131 10:03:40.426486 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" event={"ID":"663638e3-cdf5-47cb-9515-0c6e0ae1f11a","Type":"ContainerStarted","Data":"5d4cfea5fe289305f15ace715c8f6eccfc5dcf6f1414059f1b77fab36f50dfab"} Jan 31 10:03:40 crc kubenswrapper[4992]: I0131 10:03:40.451743 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" podStartSLOduration=2.02021624 podStartE2EDuration="2.451726684s" podCreationTimestamp="2026-01-31 10:03:38 +0000 UTC" firstStartedPulling="2026-01-31 10:03:39.360147481 +0000 UTC m=+2315.331539478" lastFinishedPulling="2026-01-31 10:03:39.791657935 +0000 UTC m=+2315.763049922" observedRunningTime="2026-01-31 10:03:40.442359295 +0000 UTC m=+2316.413751282" watchObservedRunningTime="2026-01-31 10:03:40.451726684 +0000 UTC m=+2316.423118661" Jan 31 10:03:46 crc kubenswrapper[4992]: I0131 10:03:46.182504 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:03:46 crc kubenswrapper[4992]: E0131 10:03:46.183307 4992 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:03:57 crc kubenswrapper[4992]: I0131 10:03:57.183151 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:03:57 crc kubenswrapper[4992]: E0131 10:03:57.183805 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:04:08 crc kubenswrapper[4992]: I0131 10:04:08.182856 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:04:08 crc kubenswrapper[4992]: E0131 10:04:08.183702 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:04:16 crc kubenswrapper[4992]: I0131 10:04:16.739510 4992 generic.go:334] "Generic (PLEG): container finished" podID="663638e3-cdf5-47cb-9515-0c6e0ae1f11a" containerID="5d4cfea5fe289305f15ace715c8f6eccfc5dcf6f1414059f1b77fab36f50dfab" 
exitCode=0 Jan 31 10:04:16 crc kubenswrapper[4992]: I0131 10:04:16.740131 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" event={"ID":"663638e3-cdf5-47cb-9515-0c6e0ae1f11a","Type":"ContainerDied","Data":"5d4cfea5fe289305f15ace715c8f6eccfc5dcf6f1414059f1b77fab36f50dfab"} Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.180728 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.371005 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lk7\" (UniqueName: \"kubernetes.io/projected/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-kube-api-access-26lk7\") pod \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.371266 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-inventory\") pod \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.371374 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ceph\") pod \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.371479 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ssh-key-openstack-edpm-ipam\") pod \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\" (UID: \"663638e3-cdf5-47cb-9515-0c6e0ae1f11a\") " Jan 31 10:04:18 
crc kubenswrapper[4992]: I0131 10:04:18.376983 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ceph" (OuterVolumeSpecName: "ceph") pod "663638e3-cdf5-47cb-9515-0c6e0ae1f11a" (UID: "663638e3-cdf5-47cb-9515-0c6e0ae1f11a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.377109 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-kube-api-access-26lk7" (OuterVolumeSpecName: "kube-api-access-26lk7") pod "663638e3-cdf5-47cb-9515-0c6e0ae1f11a" (UID: "663638e3-cdf5-47cb-9515-0c6e0ae1f11a"). InnerVolumeSpecName "kube-api-access-26lk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.400858 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-inventory" (OuterVolumeSpecName: "inventory") pod "663638e3-cdf5-47cb-9515-0c6e0ae1f11a" (UID: "663638e3-cdf5-47cb-9515-0c6e0ae1f11a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.418995 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "663638e3-cdf5-47cb-9515-0c6e0ae1f11a" (UID: "663638e3-cdf5-47cb-9515-0c6e0ae1f11a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.474272 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.474301 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.474313 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.474327 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lk7\" (UniqueName: \"kubernetes.io/projected/663638e3-cdf5-47cb-9515-0c6e0ae1f11a-kube-api-access-26lk7\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.755787 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" event={"ID":"663638e3-cdf5-47cb-9515-0c6e0ae1f11a","Type":"ContainerDied","Data":"5f51a3e7eb5aa3307da0cad835ae959c5051a614c1c0f37d722e1aec2179bdd4"} Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.755825 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f51a3e7eb5aa3307da0cad835ae959c5051a614c1c0f37d722e1aec2179bdd4" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.755839 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kz995" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.865892 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7rddg"] Jan 31 10:04:18 crc kubenswrapper[4992]: E0131 10:04:18.866315 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663638e3-cdf5-47cb-9515-0c6e0ae1f11a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.866340 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="663638e3-cdf5-47cb-9515-0c6e0ae1f11a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.866637 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="663638e3-cdf5-47cb-9515-0c6e0ae1f11a" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.867360 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.870730 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.871105 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.872946 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.873125 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.874249 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.898172 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7rddg"] Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.984737 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqz8\" (UniqueName: \"kubernetes.io/projected/1a869ced-d71b-45ea-9e5f-f2f83646d603-kube-api-access-8zqz8\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.984819 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.984919 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ceph\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:18 crc kubenswrapper[4992]: I0131 10:04:18.984948 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.086363 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ceph\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.086428 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.086519 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqz8\" (UniqueName: \"kubernetes.io/projected/1a869ced-d71b-45ea-9e5f-f2f83646d603-kube-api-access-8zqz8\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: 
\"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.086572 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.091043 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.091210 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ceph\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.091295 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.111659 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqz8\" (UniqueName: \"kubernetes.io/projected/1a869ced-d71b-45ea-9e5f-f2f83646d603-kube-api-access-8zqz8\") pod 
\"ssh-known-hosts-edpm-deployment-7rddg\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.182303 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.709159 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7rddg"] Jan 31 10:04:19 crc kubenswrapper[4992]: W0131 10:04:19.720039 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a869ced_d71b_45ea_9e5f_f2f83646d603.slice/crio-26f18ba48b3922012b317befe5a430eca74047d7c11425dd84fe98302ea102ad WatchSource:0}: Error finding container 26f18ba48b3922012b317befe5a430eca74047d7c11425dd84fe98302ea102ad: Status 404 returned error can't find the container with id 26f18ba48b3922012b317befe5a430eca74047d7c11425dd84fe98302ea102ad Jan 31 10:04:19 crc kubenswrapper[4992]: I0131 10:04:19.769288 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" event={"ID":"1a869ced-d71b-45ea-9e5f-f2f83646d603","Type":"ContainerStarted","Data":"26f18ba48b3922012b317befe5a430eca74047d7c11425dd84fe98302ea102ad"} Jan 31 10:04:20 crc kubenswrapper[4992]: I0131 10:04:20.183662 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:04:20 crc kubenswrapper[4992]: E0131 10:04:20.184025 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:04:20 crc kubenswrapper[4992]: I0131 10:04:20.790243 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" event={"ID":"1a869ced-d71b-45ea-9e5f-f2f83646d603","Type":"ContainerStarted","Data":"21ef45dc41b87b2b9a4012e3db40e1ef4477c3f1255441027c742f1331f034bd"} Jan 31 10:04:20 crc kubenswrapper[4992]: I0131 10:04:20.810132 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" podStartSLOduration=2.282375135 podStartE2EDuration="2.810114883s" podCreationTimestamp="2026-01-31 10:04:18 +0000 UTC" firstStartedPulling="2026-01-31 10:04:19.723948187 +0000 UTC m=+2355.695340184" lastFinishedPulling="2026-01-31 10:04:20.251687945 +0000 UTC m=+2356.223079932" observedRunningTime="2026-01-31 10:04:20.806411157 +0000 UTC m=+2356.777803234" watchObservedRunningTime="2026-01-31 10:04:20.810114883 +0000 UTC m=+2356.781506870" Jan 31 10:04:28 crc kubenswrapper[4992]: I0131 10:04:28.878109 4992 generic.go:334] "Generic (PLEG): container finished" podID="1a869ced-d71b-45ea-9e5f-f2f83646d603" containerID="21ef45dc41b87b2b9a4012e3db40e1ef4477c3f1255441027c742f1331f034bd" exitCode=0 Jan 31 10:04:28 crc kubenswrapper[4992]: I0131 10:04:28.878224 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" event={"ID":"1a869ced-d71b-45ea-9e5f-f2f83646d603","Type":"ContainerDied","Data":"21ef45dc41b87b2b9a4012e3db40e1ef4477c3f1255441027c742f1331f034bd"} Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.287586 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.393793 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ssh-key-openstack-edpm-ipam\") pod \"1a869ced-d71b-45ea-9e5f-f2f83646d603\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.393909 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zqz8\" (UniqueName: \"kubernetes.io/projected/1a869ced-d71b-45ea-9e5f-f2f83646d603-kube-api-access-8zqz8\") pod \"1a869ced-d71b-45ea-9e5f-f2f83646d603\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.393956 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ceph\") pod \"1a869ced-d71b-45ea-9e5f-f2f83646d603\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.394036 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-inventory-0\") pod \"1a869ced-d71b-45ea-9e5f-f2f83646d603\" (UID: \"1a869ced-d71b-45ea-9e5f-f2f83646d603\") " Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.399543 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ceph" (OuterVolumeSpecName: "ceph") pod "1a869ced-d71b-45ea-9e5f-f2f83646d603" (UID: "1a869ced-d71b-45ea-9e5f-f2f83646d603"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.413516 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a869ced-d71b-45ea-9e5f-f2f83646d603-kube-api-access-8zqz8" (OuterVolumeSpecName: "kube-api-access-8zqz8") pod "1a869ced-d71b-45ea-9e5f-f2f83646d603" (UID: "1a869ced-d71b-45ea-9e5f-f2f83646d603"). InnerVolumeSpecName "kube-api-access-8zqz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.424744 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "1a869ced-d71b-45ea-9e5f-f2f83646d603" (UID: "1a869ced-d71b-45ea-9e5f-f2f83646d603"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.449894 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1a869ced-d71b-45ea-9e5f-f2f83646d603" (UID: "1a869ced-d71b-45ea-9e5f-f2f83646d603"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.496269 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.496537 4992 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.496613 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1a869ced-d71b-45ea-9e5f-f2f83646d603-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.496683 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zqz8\" (UniqueName: \"kubernetes.io/projected/1a869ced-d71b-45ea-9e5f-f2f83646d603-kube-api-access-8zqz8\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.898886 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" event={"ID":"1a869ced-d71b-45ea-9e5f-f2f83646d603","Type":"ContainerDied","Data":"26f18ba48b3922012b317befe5a430eca74047d7c11425dd84fe98302ea102ad"} Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.899371 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26f18ba48b3922012b317befe5a430eca74047d7c11425dd84fe98302ea102ad" Jan 31 10:04:30 crc kubenswrapper[4992]: I0131 10:04:30.898960 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7rddg" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.020832 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4"] Jan 31 10:04:31 crc kubenswrapper[4992]: E0131 10:04:31.021776 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a869ced-d71b-45ea-9e5f-f2f83646d603" containerName="ssh-known-hosts-edpm-deployment" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.021938 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a869ced-d71b-45ea-9e5f-f2f83646d603" containerName="ssh-known-hosts-edpm-deployment" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.022455 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a869ced-d71b-45ea-9e5f-f2f83646d603" containerName="ssh-known-hosts-edpm-deployment" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.023634 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.027095 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.027194 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.027844 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.028327 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.028542 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.039010 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4"] Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.107643 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.107708 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.107785 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwrh\" (UniqueName: \"kubernetes.io/projected/b936aa38-b3af-4639-90d0-d54936217a7e-kube-api-access-tqwrh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.108132 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.210213 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.210433 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.210490 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.210635 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwrh\" (UniqueName: \"kubernetes.io/projected/b936aa38-b3af-4639-90d0-d54936217a7e-kube-api-access-tqwrh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.217569 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.217852 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.218622 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc 
kubenswrapper[4992]: I0131 10:04:31.251510 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwrh\" (UniqueName: \"kubernetes.io/projected/b936aa38-b3af-4639-90d0-d54936217a7e-kube-api-access-tqwrh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5zmv4\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.350586 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.864834 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4"] Jan 31 10:04:31 crc kubenswrapper[4992]: W0131 10:04:31.871193 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb936aa38_b3af_4639_90d0_d54936217a7e.slice/crio-5588b7f5affd4a74650160bb288defc63d833431ac0240664ce60b5b297d6540 WatchSource:0}: Error finding container 5588b7f5affd4a74650160bb288defc63d833431ac0240664ce60b5b297d6540: Status 404 returned error can't find the container with id 5588b7f5affd4a74650160bb288defc63d833431ac0240664ce60b5b297d6540 Jan 31 10:04:31 crc kubenswrapper[4992]: I0131 10:04:31.906282 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" event={"ID":"b936aa38-b3af-4639-90d0-d54936217a7e","Type":"ContainerStarted","Data":"5588b7f5affd4a74650160bb288defc63d833431ac0240664ce60b5b297d6540"} Jan 31 10:04:32 crc kubenswrapper[4992]: I0131 10:04:32.917628 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" 
event={"ID":"b936aa38-b3af-4639-90d0-d54936217a7e","Type":"ContainerStarted","Data":"570f148ebdaa13e07ec69d18be65e1ccab4ed8a85839f1a278cf9d83af0b9b84"} Jan 31 10:04:32 crc kubenswrapper[4992]: I0131 10:04:32.948743 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" podStartSLOduration=2.500046288 podStartE2EDuration="2.948723285s" podCreationTimestamp="2026-01-31 10:04:30 +0000 UTC" firstStartedPulling="2026-01-31 10:04:31.875559901 +0000 UTC m=+2367.846951888" lastFinishedPulling="2026-01-31 10:04:32.324236898 +0000 UTC m=+2368.295628885" observedRunningTime="2026-01-31 10:04:32.942814395 +0000 UTC m=+2368.914206402" watchObservedRunningTime="2026-01-31 10:04:32.948723285 +0000 UTC m=+2368.920115292" Jan 31 10:04:35 crc kubenswrapper[4992]: I0131 10:04:35.194500 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:04:35 crc kubenswrapper[4992]: E0131 10:04:35.195189 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:04:39 crc kubenswrapper[4992]: I0131 10:04:39.982490 4992 generic.go:334] "Generic (PLEG): container finished" podID="b936aa38-b3af-4639-90d0-d54936217a7e" containerID="570f148ebdaa13e07ec69d18be65e1ccab4ed8a85839f1a278cf9d83af0b9b84" exitCode=0 Jan 31 10:04:39 crc kubenswrapper[4992]: I0131 10:04:39.982582 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" 
event={"ID":"b936aa38-b3af-4639-90d0-d54936217a7e","Type":"ContainerDied","Data":"570f148ebdaa13e07ec69d18be65e1ccab4ed8a85839f1a278cf9d83af0b9b84"} Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.393039 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.450677 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ssh-key-openstack-edpm-ipam\") pod \"b936aa38-b3af-4639-90d0-d54936217a7e\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.450807 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ceph\") pod \"b936aa38-b3af-4639-90d0-d54936217a7e\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.450894 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-inventory\") pod \"b936aa38-b3af-4639-90d0-d54936217a7e\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.450975 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqwrh\" (UniqueName: \"kubernetes.io/projected/b936aa38-b3af-4639-90d0-d54936217a7e-kube-api-access-tqwrh\") pod \"b936aa38-b3af-4639-90d0-d54936217a7e\" (UID: \"b936aa38-b3af-4639-90d0-d54936217a7e\") " Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.463557 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ceph" 
(OuterVolumeSpecName: "ceph") pod "b936aa38-b3af-4639-90d0-d54936217a7e" (UID: "b936aa38-b3af-4639-90d0-d54936217a7e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.464037 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b936aa38-b3af-4639-90d0-d54936217a7e-kube-api-access-tqwrh" (OuterVolumeSpecName: "kube-api-access-tqwrh") pod "b936aa38-b3af-4639-90d0-d54936217a7e" (UID: "b936aa38-b3af-4639-90d0-d54936217a7e"). InnerVolumeSpecName "kube-api-access-tqwrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.484808 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b936aa38-b3af-4639-90d0-d54936217a7e" (UID: "b936aa38-b3af-4639-90d0-d54936217a7e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.487532 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-inventory" (OuterVolumeSpecName: "inventory") pod "b936aa38-b3af-4639-90d0-d54936217a7e" (UID: "b936aa38-b3af-4639-90d0-d54936217a7e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.552894 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.552924 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.552933 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b936aa38-b3af-4639-90d0-d54936217a7e-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:41 crc kubenswrapper[4992]: I0131 10:04:41.552944 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqwrh\" (UniqueName: \"kubernetes.io/projected/b936aa38-b3af-4639-90d0-d54936217a7e-kube-api-access-tqwrh\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.004639 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" event={"ID":"b936aa38-b3af-4639-90d0-d54936217a7e","Type":"ContainerDied","Data":"5588b7f5affd4a74650160bb288defc63d833431ac0240664ce60b5b297d6540"} Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.004678 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5588b7f5affd4a74650160bb288defc63d833431ac0240664ce60b5b297d6540" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.004714 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5zmv4" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.074961 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv"] Jan 31 10:04:42 crc kubenswrapper[4992]: E0131 10:04:42.075399 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b936aa38-b3af-4639-90d0-d54936217a7e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.075435 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b936aa38-b3af-4639-90d0-d54936217a7e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.076556 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b936aa38-b3af-4639-90d0-d54936217a7e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.077275 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.079947 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.080135 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.080240 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.080709 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.080917 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.082739 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv"] Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.169135 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.169442 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.169562 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.169669 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-985bs\" (UniqueName: \"kubernetes.io/projected/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-kube-api-access-985bs\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.272249 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.272323 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.272412 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-985bs\" (UniqueName: 
\"kubernetes.io/projected/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-kube-api-access-985bs\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.272574 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.276995 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.277204 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.277833 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 
10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.289104 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-985bs\" (UniqueName: \"kubernetes.io/projected/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-kube-api-access-985bs\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.411184 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:42 crc kubenswrapper[4992]: I0131 10:04:42.967566 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv"] Jan 31 10:04:43 crc kubenswrapper[4992]: I0131 10:04:43.020263 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" event={"ID":"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540","Type":"ContainerStarted","Data":"4428399d4f904cfc0aa78a00e59890c1259a1bab4777f165ad733843f0e5b298"} Jan 31 10:04:44 crc kubenswrapper[4992]: I0131 10:04:44.030546 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" event={"ID":"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540","Type":"ContainerStarted","Data":"4a7a6cd1f4016dbb8a9477e88612f8ebf307fa5acc49801b5ea548a8c20db8e9"} Jan 31 10:04:44 crc kubenswrapper[4992]: I0131 10:04:44.055149 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" podStartSLOduration=1.5142970249999999 podStartE2EDuration="2.055129839s" podCreationTimestamp="2026-01-31 10:04:42 +0000 UTC" firstStartedPulling="2026-01-31 10:04:43.001122136 +0000 UTC m=+2378.972514133" lastFinishedPulling="2026-01-31 10:04:43.54195495 +0000 UTC m=+2379.513346947" 
observedRunningTime="2026-01-31 10:04:44.047093468 +0000 UTC m=+2380.018485465" watchObservedRunningTime="2026-01-31 10:04:44.055129839 +0000 UTC m=+2380.026521826" Jan 31 10:04:49 crc kubenswrapper[4992]: I0131 10:04:49.183193 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:04:49 crc kubenswrapper[4992]: E0131 10:04:49.184354 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:04:53 crc kubenswrapper[4992]: I0131 10:04:53.115843 4992 generic.go:334] "Generic (PLEG): container finished" podID="c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" containerID="4a7a6cd1f4016dbb8a9477e88612f8ebf307fa5acc49801b5ea548a8c20db8e9" exitCode=0 Jan 31 10:04:53 crc kubenswrapper[4992]: I0131 10:04:53.115966 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" event={"ID":"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540","Type":"ContainerDied","Data":"4a7a6cd1f4016dbb8a9477e88612f8ebf307fa5acc49801b5ea548a8c20db8e9"} Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.522387 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.611159 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-inventory\") pod \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.611527 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ceph\") pod \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.611617 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-985bs\" (UniqueName: \"kubernetes.io/projected/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-kube-api-access-985bs\") pod \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.611697 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ssh-key-openstack-edpm-ipam\") pod \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\" (UID: \"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540\") " Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.617135 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-kube-api-access-985bs" (OuterVolumeSpecName: "kube-api-access-985bs") pod "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" (UID: "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540"). InnerVolumeSpecName "kube-api-access-985bs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.617195 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ceph" (OuterVolumeSpecName: "ceph") pod "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" (UID: "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.635280 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-inventory" (OuterVolumeSpecName: "inventory") pod "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" (UID: "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.651373 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" (UID: "c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.713536 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.713569 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.713582 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-985bs\" (UniqueName: \"kubernetes.io/projected/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-kube-api-access-985bs\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:54 crc kubenswrapper[4992]: I0131 10:04:54.713596 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.147843 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" event={"ID":"c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540","Type":"ContainerDied","Data":"4428399d4f904cfc0aa78a00e59890c1259a1bab4777f165ad733843f0e5b298"} Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.147879 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4428399d4f904cfc0aa78a00e59890c1259a1bab4777f165ad733843f0e5b298" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.148215 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.309401 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9"] Jan 31 10:04:55 crc kubenswrapper[4992]: E0131 10:04:55.309901 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.309925 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.310106 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.310708 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.312835 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.314744 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.314848 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.315045 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.315188 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.315610 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.316728 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.316946 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.325039 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9"] Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425127 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plzgb\" (UniqueName: 
\"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-kube-api-access-plzgb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425262 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425343 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425405 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425491 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425531 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425575 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425611 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425642 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425680 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425705 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425731 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.425805 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527343 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527444 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plzgb\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-kube-api-access-plzgb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527475 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527506 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527534 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527567 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527586 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527613 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: 
\"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527634 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527654 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527678 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.527697 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 
crc kubenswrapper[4992]: I0131 10:04:55.527722 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.532599 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.532635 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.533078 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.533104 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.533880 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.534294 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.534523 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.534964 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.535551 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.536491 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.537525 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.537959 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.552575 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plzgb\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-kube-api-access-plzgb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:55 crc kubenswrapper[4992]: I0131 10:04:55.625935 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:04:56 crc kubenswrapper[4992]: I0131 10:04:56.153354 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9"] Jan 31 10:04:56 crc kubenswrapper[4992]: I0131 10:04:56.167762 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 10:04:57 crc kubenswrapper[4992]: I0131 10:04:57.164354 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" event={"ID":"69ab53db-b628-4853-9f28-c81ab402290b","Type":"ContainerStarted","Data":"07d78d280ab5981642e488848d8d36efd1b5a5fc74431f40a543f43441ec2dec"} Jan 31 10:04:57 crc kubenswrapper[4992]: I0131 10:04:57.165860 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" event={"ID":"69ab53db-b628-4853-9f28-c81ab402290b","Type":"ContainerStarted","Data":"db11c301e957dc1060c0db849abbcc5f1bd5f0a206060f3ddb973d9d640f140e"} Jan 31 10:04:57 crc kubenswrapper[4992]: I0131 10:04:57.183868 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" 
podStartSLOduration=1.703557382 podStartE2EDuration="2.183847847s" podCreationTimestamp="2026-01-31 10:04:55 +0000 UTC" firstStartedPulling="2026-01-31 10:04:56.167545847 +0000 UTC m=+2392.138937834" lastFinishedPulling="2026-01-31 10:04:56.647836312 +0000 UTC m=+2392.619228299" observedRunningTime="2026-01-31 10:04:57.183466126 +0000 UTC m=+2393.154858123" watchObservedRunningTime="2026-01-31 10:04:57.183847847 +0000 UTC m=+2393.155239834" Jan 31 10:05:01 crc kubenswrapper[4992]: I0131 10:05:01.182853 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:05:01 crc kubenswrapper[4992]: E0131 10:05:01.183597 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:05:12 crc kubenswrapper[4992]: I0131 10:05:12.182740 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:05:12 crc kubenswrapper[4992]: E0131 10:05:12.183827 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:05:25 crc kubenswrapper[4992]: I0131 10:05:25.392838 4992 generic.go:334] "Generic (PLEG): container finished" podID="69ab53db-b628-4853-9f28-c81ab402290b" 
containerID="07d78d280ab5981642e488848d8d36efd1b5a5fc74431f40a543f43441ec2dec" exitCode=0 Jan 31 10:05:25 crc kubenswrapper[4992]: I0131 10:05:25.392953 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" event={"ID":"69ab53db-b628-4853-9f28-c81ab402290b","Type":"ContainerDied","Data":"07d78d280ab5981642e488848d8d36efd1b5a5fc74431f40a543f43441ec2dec"} Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.182451 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:05:26 crc kubenswrapper[4992]: E0131 10:05:26.182943 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.770601 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847278 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847393 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-neutron-metadata-combined-ca-bundle\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847487 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-libvirt-combined-ca-bundle\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847546 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-inventory\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847576 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" 
(UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847602 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847646 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ssh-key-openstack-edpm-ipam\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847714 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-nova-combined-ca-bundle\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847735 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plzgb\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-kube-api-access-plzgb\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847771 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-repo-setup-combined-ca-bundle\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 
10:05:26.847788 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ovn-combined-ca-bundle\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847836 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-bootstrap-combined-ca-bundle\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.847883 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ceph\") pod \"69ab53db-b628-4853-9f28-c81ab402290b\" (UID: \"69ab53db-b628-4853-9f28-c81ab402290b\") " Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.853961 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.854340 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.854371 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ceph" (OuterVolumeSpecName: "ceph") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.854746 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.855241 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.855409 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.855754 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.856539 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-kube-api-access-plzgb" (OuterVolumeSpecName: "kube-api-access-plzgb") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "kube-api-access-plzgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.856627 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.857047 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.858156 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.882001 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-inventory" (OuterVolumeSpecName: "inventory") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.886457 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "69ab53db-b628-4853-9f28-c81ab402290b" (UID: "69ab53db-b628-4853-9f28-c81ab402290b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951801 4992 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951859 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plzgb\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-kube-api-access-plzgb\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951872 4992 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951884 4992 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951899 4992 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951912 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951926 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951939 4992 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951953 4992 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951980 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.951992 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.952004 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/69ab53db-b628-4853-9f28-c81ab402290b-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:26 crc kubenswrapper[4992]: I0131 10:05:26.952018 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/69ab53db-b628-4853-9f28-c81ab402290b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 
10:05:27.414962 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" event={"ID":"69ab53db-b628-4853-9f28-c81ab402290b","Type":"ContainerDied","Data":"db11c301e957dc1060c0db849abbcc5f1bd5f0a206060f3ddb973d9d640f140e"} Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.415012 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db11c301e957dc1060c0db849abbcc5f1bd5f0a206060f3ddb973d9d640f140e" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.415037 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.523186 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc"] Jan 31 10:05:27 crc kubenswrapper[4992]: E0131 10:05:27.523594 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ab53db-b628-4853-9f28-c81ab402290b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.523610 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ab53db-b628-4853-9f28-c81ab402290b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.523782 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ab53db-b628-4853-9f28-c81ab402290b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.524359 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.528844 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.528866 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.529002 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.529076 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.529130 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.533452 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc"] Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.665023 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.665083 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: 
\"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.665207 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2cm\" (UniqueName: \"kubernetes.io/projected/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-kube-api-access-2r2cm\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.665296 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.767087 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.767226 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.767247 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.767331 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2cm\" (UniqueName: \"kubernetes.io/projected/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-kube-api-access-2r2cm\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.772344 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.773908 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.774234 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.792592 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2cm\" (UniqueName: \"kubernetes.io/projected/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-kube-api-access-2r2cm\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:27 crc kubenswrapper[4992]: I0131 10:05:27.865055 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:28 crc kubenswrapper[4992]: W0131 10:05:28.409158 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef20b86c_51f8_49eb_91f8_d9229b97cdbf.slice/crio-27292e7a7d6065316f3aa0bbb7a2dee8434f835d0d8b4a860744c9b34a6bbd9c WatchSource:0}: Error finding container 27292e7a7d6065316f3aa0bbb7a2dee8434f835d0d8b4a860744c9b34a6bbd9c: Status 404 returned error can't find the container with id 27292e7a7d6065316f3aa0bbb7a2dee8434f835d0d8b4a860744c9b34a6bbd9c Jan 31 10:05:28 crc kubenswrapper[4992]: I0131 10:05:28.412534 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc"] Jan 31 10:05:28 crc kubenswrapper[4992]: I0131 10:05:28.429231 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" event={"ID":"ef20b86c-51f8-49eb-91f8-d9229b97cdbf","Type":"ContainerStarted","Data":"27292e7a7d6065316f3aa0bbb7a2dee8434f835d0d8b4a860744c9b34a6bbd9c"} Jan 31 10:05:29 crc kubenswrapper[4992]: I0131 10:05:29.438963 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" 
event={"ID":"ef20b86c-51f8-49eb-91f8-d9229b97cdbf","Type":"ContainerStarted","Data":"f2d6b744dd46983c977d0dfbb7c74cb10d281c9c5f8078a84428493624f63161"} Jan 31 10:05:29 crc kubenswrapper[4992]: I0131 10:05:29.460394 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" podStartSLOduration=2.004219938 podStartE2EDuration="2.460369027s" podCreationTimestamp="2026-01-31 10:05:27 +0000 UTC" firstStartedPulling="2026-01-31 10:05:28.418746977 +0000 UTC m=+2424.390138964" lastFinishedPulling="2026-01-31 10:05:28.874896066 +0000 UTC m=+2424.846288053" observedRunningTime="2026-01-31 10:05:29.455669082 +0000 UTC m=+2425.427061089" watchObservedRunningTime="2026-01-31 10:05:29.460369027 +0000 UTC m=+2425.431761024" Jan 31 10:05:34 crc kubenswrapper[4992]: I0131 10:05:34.481654 4992 generic.go:334] "Generic (PLEG): container finished" podID="ef20b86c-51f8-49eb-91f8-d9229b97cdbf" containerID="f2d6b744dd46983c977d0dfbb7c74cb10d281c9c5f8078a84428493624f63161" exitCode=0 Jan 31 10:05:34 crc kubenswrapper[4992]: I0131 10:05:34.481777 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" event={"ID":"ef20b86c-51f8-49eb-91f8-d9229b97cdbf","Type":"ContainerDied","Data":"f2d6b744dd46983c977d0dfbb7c74cb10d281c9c5f8078a84428493624f63161"} Jan 31 10:05:35 crc kubenswrapper[4992]: I0131 10:05:35.899371 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.018409 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ceph\") pod \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.018517 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2cm\" (UniqueName: \"kubernetes.io/projected/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-kube-api-access-2r2cm\") pod \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.018626 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-inventory\") pod \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.018720 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ssh-key-openstack-edpm-ipam\") pod \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\" (UID: \"ef20b86c-51f8-49eb-91f8-d9229b97cdbf\") " Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.024187 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ceph" (OuterVolumeSpecName: "ceph") pod "ef20b86c-51f8-49eb-91f8-d9229b97cdbf" (UID: "ef20b86c-51f8-49eb-91f8-d9229b97cdbf"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.024236 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-kube-api-access-2r2cm" (OuterVolumeSpecName: "kube-api-access-2r2cm") pod "ef20b86c-51f8-49eb-91f8-d9229b97cdbf" (UID: "ef20b86c-51f8-49eb-91f8-d9229b97cdbf"). InnerVolumeSpecName "kube-api-access-2r2cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.043822 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-inventory" (OuterVolumeSpecName: "inventory") pod "ef20b86c-51f8-49eb-91f8-d9229b97cdbf" (UID: "ef20b86c-51f8-49eb-91f8-d9229b97cdbf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.046202 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ef20b86c-51f8-49eb-91f8-d9229b97cdbf" (UID: "ef20b86c-51f8-49eb-91f8-d9229b97cdbf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.121263 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2cm\" (UniqueName: \"kubernetes.io/projected/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-kube-api-access-2r2cm\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.121333 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.121349 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.121360 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ef20b86c-51f8-49eb-91f8-d9229b97cdbf-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.496341 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" event={"ID":"ef20b86c-51f8-49eb-91f8-d9229b97cdbf","Type":"ContainerDied","Data":"27292e7a7d6065316f3aa0bbb7a2dee8434f835d0d8b4a860744c9b34a6bbd9c"} Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.496688 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27292e7a7d6065316f3aa0bbb7a2dee8434f835d0d8b4a860744c9b34a6bbd9c" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.496427 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.603614 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r"] Jan 31 10:05:36 crc kubenswrapper[4992]: E0131 10:05:36.604110 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef20b86c-51f8-49eb-91f8-d9229b97cdbf" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.604134 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef20b86c-51f8-49eb-91f8-d9229b97cdbf" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.604350 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef20b86c-51f8-49eb-91f8-d9229b97cdbf" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.605162 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.608439 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.608440 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.608584 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.609006 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.613517 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.613796 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.614119 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r"] Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.734027 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aac5fc1b-1bde-4799-bf43-efe86969c792-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.734324 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.734478 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stjvj\" (UniqueName: \"kubernetes.io/projected/aac5fc1b-1bde-4799-bf43-efe86969c792-kube-api-access-stjvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.734651 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.734848 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.734897 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.836677 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stjvj\" (UniqueName: \"kubernetes.io/projected/aac5fc1b-1bde-4799-bf43-efe86969c792-kube-api-access-stjvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.836777 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.836829 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.836853 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.836925 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/aac5fc1b-1bde-4799-bf43-efe86969c792-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.836967 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.838586 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aac5fc1b-1bde-4799-bf43-efe86969c792-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.842439 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.842691 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.843260 
4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.844691 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.855889 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stjvj\" (UniqueName: \"kubernetes.io/projected/aac5fc1b-1bde-4799-bf43-efe86969c792-kube-api-access-stjvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cbm9r\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:36 crc kubenswrapper[4992]: I0131 10:05:36.924803 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:05:37 crc kubenswrapper[4992]: I0131 10:05:37.183406 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:05:37 crc kubenswrapper[4992]: E0131 10:05:37.183966 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:05:37 crc kubenswrapper[4992]: I0131 10:05:37.420212 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r"] Jan 31 10:05:37 crc kubenswrapper[4992]: W0131 10:05:37.426025 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaac5fc1b_1bde_4799_bf43_efe86969c792.slice/crio-29a636ce6589728753c9b465a2709f2e699ce8df5e70f511d9443e07fe731462 WatchSource:0}: Error finding container 29a636ce6589728753c9b465a2709f2e699ce8df5e70f511d9443e07fe731462: Status 404 returned error can't find the container with id 29a636ce6589728753c9b465a2709f2e699ce8df5e70f511d9443e07fe731462 Jan 31 10:05:37 crc kubenswrapper[4992]: I0131 10:05:37.505302 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" event={"ID":"aac5fc1b-1bde-4799-bf43-efe86969c792","Type":"ContainerStarted","Data":"29a636ce6589728753c9b465a2709f2e699ce8df5e70f511d9443e07fe731462"} Jan 31 10:05:38 crc kubenswrapper[4992]: I0131 10:05:38.516395 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" 
event={"ID":"aac5fc1b-1bde-4799-bf43-efe86969c792","Type":"ContainerStarted","Data":"a2fe3f04ea4d91f43301dd6484bb6a2eeef0a8fb3feb8eb0da7ae4f42f058f5a"} Jan 31 10:05:38 crc kubenswrapper[4992]: I0131 10:05:38.539970 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" podStartSLOduration=2.009267362 podStartE2EDuration="2.539945877s" podCreationTimestamp="2026-01-31 10:05:36 +0000 UTC" firstStartedPulling="2026-01-31 10:05:37.429971459 +0000 UTC m=+2433.401363456" lastFinishedPulling="2026-01-31 10:05:37.960649964 +0000 UTC m=+2433.932041971" observedRunningTime="2026-01-31 10:05:38.533849701 +0000 UTC m=+2434.505241718" watchObservedRunningTime="2026-01-31 10:05:38.539945877 +0000 UTC m=+2434.511337864" Jan 31 10:05:49 crc kubenswrapper[4992]: I0131 10:05:49.182742 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:05:49 crc kubenswrapper[4992]: E0131 10:05:49.183308 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:06:01 crc kubenswrapper[4992]: I0131 10:06:01.182926 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:06:01 crc kubenswrapper[4992]: E0131 10:06:01.183725 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:06:15 crc kubenswrapper[4992]: I0131 10:06:15.194576 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:06:15 crc kubenswrapper[4992]: E0131 10:06:15.196256 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:06:27 crc kubenswrapper[4992]: I0131 10:06:27.183280 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:06:27 crc kubenswrapper[4992]: E0131 10:06:27.184117 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:06:38 crc kubenswrapper[4992]: I0131 10:06:38.183476 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:06:38 crc kubenswrapper[4992]: E0131 10:06:38.184469 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:06:41 crc kubenswrapper[4992]: I0131 10:06:41.697168 4992 generic.go:334] "Generic (PLEG): container finished" podID="aac5fc1b-1bde-4799-bf43-efe86969c792" containerID="a2fe3f04ea4d91f43301dd6484bb6a2eeef0a8fb3feb8eb0da7ae4f42f058f5a" exitCode=0 Jan 31 10:06:41 crc kubenswrapper[4992]: I0131 10:06:41.697230 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" event={"ID":"aac5fc1b-1bde-4799-bf43-efe86969c792","Type":"ContainerDied","Data":"a2fe3f04ea4d91f43301dd6484bb6a2eeef0a8fb3feb8eb0da7ae4f42f058f5a"} Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.146823 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.191801 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-inventory\") pod \"aac5fc1b-1bde-4799-bf43-efe86969c792\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.191939 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stjvj\" (UniqueName: \"kubernetes.io/projected/aac5fc1b-1bde-4799-bf43-efe86969c792-kube-api-access-stjvj\") pod \"aac5fc1b-1bde-4799-bf43-efe86969c792\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.191968 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ovn-combined-ca-bundle\") pod \"aac5fc1b-1bde-4799-bf43-efe86969c792\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.192057 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aac5fc1b-1bde-4799-bf43-efe86969c792-ovncontroller-config-0\") pod \"aac5fc1b-1bde-4799-bf43-efe86969c792\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.192081 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ssh-key-openstack-edpm-ipam\") pod \"aac5fc1b-1bde-4799-bf43-efe86969c792\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.192114 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ceph\") pod \"aac5fc1b-1bde-4799-bf43-efe86969c792\" (UID: \"aac5fc1b-1bde-4799-bf43-efe86969c792\") " Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.198184 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "aac5fc1b-1bde-4799-bf43-efe86969c792" (UID: "aac5fc1b-1bde-4799-bf43-efe86969c792"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.198715 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ceph" (OuterVolumeSpecName: "ceph") pod "aac5fc1b-1bde-4799-bf43-efe86969c792" (UID: "aac5fc1b-1bde-4799-bf43-efe86969c792"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.218140 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac5fc1b-1bde-4799-bf43-efe86969c792-kube-api-access-stjvj" (OuterVolumeSpecName: "kube-api-access-stjvj") pod "aac5fc1b-1bde-4799-bf43-efe86969c792" (UID: "aac5fc1b-1bde-4799-bf43-efe86969c792"). InnerVolumeSpecName "kube-api-access-stjvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.226236 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-inventory" (OuterVolumeSpecName: "inventory") pod "aac5fc1b-1bde-4799-bf43-efe86969c792" (UID: "aac5fc1b-1bde-4799-bf43-efe86969c792"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.226637 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac5fc1b-1bde-4799-bf43-efe86969c792-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "aac5fc1b-1bde-4799-bf43-efe86969c792" (UID: "aac5fc1b-1bde-4799-bf43-efe86969c792"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.227062 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aac5fc1b-1bde-4799-bf43-efe86969c792" (UID: "aac5fc1b-1bde-4799-bf43-efe86969c792"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.295190 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stjvj\" (UniqueName: \"kubernetes.io/projected/aac5fc1b-1bde-4799-bf43-efe86969c792-kube-api-access-stjvj\") on node \"crc\" DevicePath \"\"" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.295238 4992 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.295253 4992 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/aac5fc1b-1bde-4799-bf43-efe86969c792-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.295269 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.295282 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.295295 4992 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aac5fc1b-1bde-4799-bf43-efe86969c792-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.719690 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" event={"ID":"aac5fc1b-1bde-4799-bf43-efe86969c792","Type":"ContainerDied","Data":"29a636ce6589728753c9b465a2709f2e699ce8df5e70f511d9443e07fe731462"} Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.720143 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a636ce6589728753c9b465a2709f2e699ce8df5e70f511d9443e07fe731462" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.719932 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cbm9r" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.815332 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg"] Jan 31 10:06:43 crc kubenswrapper[4992]: E0131 10:06:43.815804 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac5fc1b-1bde-4799-bf43-efe86969c792" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.815816 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac5fc1b-1bde-4799-bf43-efe86969c792" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.816031 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac5fc1b-1bde-4799-bf43-efe86969c792" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.816713 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822033 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822249 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822411 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822535 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822671 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822773 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.822886 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.830565 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg"] Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905021 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905095 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905153 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgpwk\" (UniqueName: \"kubernetes.io/projected/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-kube-api-access-vgpwk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905183 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905211 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" 
Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905252 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:43 crc kubenswrapper[4992]: I0131 10:06:43.905289 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.006707 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.007061 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.007714 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.007885 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.008164 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.008335 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.008550 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgpwk\" (UniqueName: 
\"kubernetes.io/projected/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-kube-api-access-vgpwk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.011172 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.011286 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.012203 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.015562 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.019913 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.021649 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.028534 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgpwk\" (UniqueName: \"kubernetes.io/projected/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-kube-api-access-vgpwk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.140708 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.687016 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg"] Jan 31 10:06:44 crc kubenswrapper[4992]: W0131 10:06:44.689990 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4f9a40_f9be_433b_990f_8ad0ec1d8c79.slice/crio-ade58a72eba0f60dfa7a21493ef7fb683f101ba1b3a581c389a4722b29e756d0 WatchSource:0}: Error finding container ade58a72eba0f60dfa7a21493ef7fb683f101ba1b3a581c389a4722b29e756d0: Status 404 returned error can't find the container with id ade58a72eba0f60dfa7a21493ef7fb683f101ba1b3a581c389a4722b29e756d0 Jan 31 10:06:44 crc kubenswrapper[4992]: I0131 10:06:44.746447 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" event={"ID":"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79","Type":"ContainerStarted","Data":"ade58a72eba0f60dfa7a21493ef7fb683f101ba1b3a581c389a4722b29e756d0"} Jan 31 10:06:45 crc kubenswrapper[4992]: I0131 10:06:45.755015 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" event={"ID":"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79","Type":"ContainerStarted","Data":"d62b6448ef6a8e3eca955ced018381c96891dafe76bad805b10f470fb773b6d0"} Jan 31 10:06:45 crc kubenswrapper[4992]: I0131 10:06:45.776311 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" podStartSLOduration=2.302217168 podStartE2EDuration="2.776287493s" podCreationTimestamp="2026-01-31 10:06:43 +0000 UTC" firstStartedPulling="2026-01-31 10:06:44.691755598 +0000 UTC m=+2500.663147585" lastFinishedPulling="2026-01-31 10:06:45.165825923 +0000 UTC 
m=+2501.137217910" observedRunningTime="2026-01-31 10:06:45.770669951 +0000 UTC m=+2501.742061948" watchObservedRunningTime="2026-01-31 10:06:45.776287493 +0000 UTC m=+2501.747679480" Jan 31 10:06:51 crc kubenswrapper[4992]: I0131 10:06:51.183316 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:06:51 crc kubenswrapper[4992]: E0131 10:06:51.184682 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:07:02 crc kubenswrapper[4992]: I0131 10:07:02.183360 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:07:02 crc kubenswrapper[4992]: E0131 10:07:02.184119 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:07:16 crc kubenswrapper[4992]: I0131 10:07:16.182700 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:07:16 crc kubenswrapper[4992]: E0131 10:07:16.185413 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:07:31 crc kubenswrapper[4992]: I0131 10:07:31.182684 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:07:31 crc kubenswrapper[4992]: E0131 10:07:31.183815 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:07:38 crc kubenswrapper[4992]: I0131 10:07:38.599832 4992 generic.go:334] "Generic (PLEG): container finished" podID="ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" containerID="d62b6448ef6a8e3eca955ced018381c96891dafe76bad805b10f470fb773b6d0" exitCode=0 Jan 31 10:07:38 crc kubenswrapper[4992]: I0131 10:07:38.600580 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" event={"ID":"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79","Type":"ContainerDied","Data":"d62b6448ef6a8e3eca955ced018381c96891dafe76bad805b10f470fb773b6d0"} Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.018334 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.111448 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-nova-metadata-neutron-config-0\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.111545 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-inventory\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.111577 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.111605 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ssh-key-openstack-edpm-ipam\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.111639 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ceph\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 
10:07:40.112305 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgpwk\" (UniqueName: \"kubernetes.io/projected/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-kube-api-access-vgpwk\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.112466 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-metadata-combined-ca-bundle\") pod \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\" (UID: \"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79\") " Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.117045 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ceph" (OuterVolumeSpecName: "ceph") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.127529 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-kube-api-access-vgpwk" (OuterVolumeSpecName: "kube-api-access-vgpwk") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). InnerVolumeSpecName "kube-api-access-vgpwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.127692 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.138617 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.141466 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.145699 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.147291 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-inventory" (OuterVolumeSpecName: "inventory") pod "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" (UID: "ed4f9a40-f9be-433b-990f-8ad0ec1d8c79"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214476 4992 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214511 4992 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214528 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214543 4992 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214556 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214568 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.214580 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgpwk\" (UniqueName: 
\"kubernetes.io/projected/ed4f9a40-f9be-433b-990f-8ad0ec1d8c79-kube-api-access-vgpwk\") on node \"crc\" DevicePath \"\"" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.619007 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" event={"ID":"ed4f9a40-f9be-433b-990f-8ad0ec1d8c79","Type":"ContainerDied","Data":"ade58a72eba0f60dfa7a21493ef7fb683f101ba1b3a581c389a4722b29e756d0"} Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.619049 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade58a72eba0f60dfa7a21493ef7fb683f101ba1b3a581c389a4722b29e756d0" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.619077 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.741049 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn"] Jan 31 10:07:40 crc kubenswrapper[4992]: E0131 10:07:40.743651 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.743675 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.744063 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4f9a40-f9be-433b-990f-8ad0ec1d8c79" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.746752 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.749285 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.749318 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.750740 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.751346 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.752632 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.754062 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.757451 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn"] Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.824350 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttfjl\" (UniqueName: \"kubernetes.io/projected/a64d87fc-267b-4505-a807-aa020492685c-kube-api-access-ttfjl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.824406 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.824463 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.824489 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.824528 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.824548 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: 
\"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.926469 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.926565 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.926621 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.926672 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.926711 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.926915 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttfjl\" (UniqueName: \"kubernetes.io/projected/a64d87fc-267b-4505-a807-aa020492685c-kube-api-access-ttfjl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.931540 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.931539 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.932072 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.932315 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.933480 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:40 crc kubenswrapper[4992]: I0131 10:07:40.944903 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttfjl\" (UniqueName: \"kubernetes.io/projected/a64d87fc-267b-4505-a807-aa020492685c-kube-api-access-ttfjl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:41 crc kubenswrapper[4992]: I0131 10:07:41.084113 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:07:41 crc kubenswrapper[4992]: I0131 10:07:41.605411 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn"] Jan 31 10:07:41 crc kubenswrapper[4992]: I0131 10:07:41.627363 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" event={"ID":"a64d87fc-267b-4505-a807-aa020492685c","Type":"ContainerStarted","Data":"ade4d8ded6a1dc860ccf98b6bb94d9d805d83b3f16eba2c18a2db07997be2bcb"} Jan 31 10:07:42 crc kubenswrapper[4992]: I0131 10:07:42.182494 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:07:42 crc kubenswrapper[4992]: E0131 10:07:42.183142 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:07:42 crc kubenswrapper[4992]: I0131 10:07:42.642438 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" event={"ID":"a64d87fc-267b-4505-a807-aa020492685c","Type":"ContainerStarted","Data":"2f8e0f335b394311cce7f0072d230e087b923b96275b671368e6527df7e1632d"} Jan 31 10:07:42 crc kubenswrapper[4992]: I0131 10:07:42.684106 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" podStartSLOduration=2.259707058 podStartE2EDuration="2.684081632s" podCreationTimestamp="2026-01-31 10:07:40 +0000 UTC" firstStartedPulling="2026-01-31 10:07:41.613000394 +0000 
UTC m=+2557.584392401" lastFinishedPulling="2026-01-31 10:07:42.037374988 +0000 UTC m=+2558.008766975" observedRunningTime="2026-01-31 10:07:42.665000803 +0000 UTC m=+2558.636392840" watchObservedRunningTime="2026-01-31 10:07:42.684081632 +0000 UTC m=+2558.655473679" Jan 31 10:07:54 crc kubenswrapper[4992]: I0131 10:07:54.183141 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:07:54 crc kubenswrapper[4992]: E0131 10:07:54.185847 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:08:08 crc kubenswrapper[4992]: I0131 10:08:08.183277 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:08:08 crc kubenswrapper[4992]: E0131 10:08:08.184294 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:08:21 crc kubenswrapper[4992]: I0131 10:08:21.182809 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:08:21 crc kubenswrapper[4992]: I0131 10:08:21.978764 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"dcbcc87f018c90f071ceba0faad9cc5c81ec174891a9b84e3c799f853950d984"} Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.057664 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v75bb"] Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.062072 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.092781 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v75bb"] Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.175103 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-utilities\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.175152 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2sk\" (UniqueName: \"kubernetes.io/projected/9c3cc2f8-718c-4f12-82bb-59857093104c-kube-api-access-8b2sk\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.175194 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-catalog-content\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.276575 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-utilities\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.276622 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2sk\" (UniqueName: \"kubernetes.io/projected/9c3cc2f8-718c-4f12-82bb-59857093104c-kube-api-access-8b2sk\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.276661 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-catalog-content\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.277162 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-catalog-content\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.277500 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-utilities\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.299259 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8b2sk\" (UniqueName: \"kubernetes.io/projected/9c3cc2f8-718c-4f12-82bb-59857093104c-kube-api-access-8b2sk\") pod \"redhat-operators-v75bb\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.404843 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.846879 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v75bb"] Jan 31 10:09:59 crc kubenswrapper[4992]: I0131 10:09:59.880102 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerStarted","Data":"d632a793525ff03f620f904624573475560d4a76f1facc9a47b3235ad4b62d11"} Jan 31 10:10:00 crc kubenswrapper[4992]: I0131 10:10:00.889406 4992 generic.go:334] "Generic (PLEG): container finished" podID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerID="4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b" exitCode=0 Jan 31 10:10:00 crc kubenswrapper[4992]: I0131 10:10:00.889584 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerDied","Data":"4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b"} Jan 31 10:10:00 crc kubenswrapper[4992]: I0131 10:10:00.891782 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 10:10:01 crc kubenswrapper[4992]: I0131 10:10:01.909632 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerStarted","Data":"8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc"} 
Jan 31 10:10:02 crc kubenswrapper[4992]: I0131 10:10:02.921749 4992 generic.go:334] "Generic (PLEG): container finished" podID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerID="8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc" exitCode=0 Jan 31 10:10:02 crc kubenswrapper[4992]: I0131 10:10:02.921803 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerDied","Data":"8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc"} Jan 31 10:10:03 crc kubenswrapper[4992]: I0131 10:10:03.936945 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerStarted","Data":"fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634"} Jan 31 10:10:03 crc kubenswrapper[4992]: I0131 10:10:03.965706 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v75bb" podStartSLOduration=2.384457374 podStartE2EDuration="4.965552324s" podCreationTimestamp="2026-01-31 10:09:59 +0000 UTC" firstStartedPulling="2026-01-31 10:10:00.891382643 +0000 UTC m=+2696.862774640" lastFinishedPulling="2026-01-31 10:10:03.472477593 +0000 UTC m=+2699.443869590" observedRunningTime="2026-01-31 10:10:03.955226727 +0000 UTC m=+2699.926618724" watchObservedRunningTime="2026-01-31 10:10:03.965552324 +0000 UTC m=+2699.936944311" Jan 31 10:10:09 crc kubenswrapper[4992]: I0131 10:10:09.405436 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:10:09 crc kubenswrapper[4992]: I0131 10:10:09.405898 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:10:09 crc kubenswrapper[4992]: I0131 10:10:09.456511 4992 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:10:10 crc kubenswrapper[4992]: I0131 10:10:10.048492 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:10:10 crc kubenswrapper[4992]: I0131 10:10:10.089734 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v75bb"] Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.001752 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v75bb" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="registry-server" containerID="cri-o://fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634" gracePeriod=2 Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.442265 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.539651 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-utilities\") pod \"9c3cc2f8-718c-4f12-82bb-59857093104c\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.539723 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b2sk\" (UniqueName: \"kubernetes.io/projected/9c3cc2f8-718c-4f12-82bb-59857093104c-kube-api-access-8b2sk\") pod \"9c3cc2f8-718c-4f12-82bb-59857093104c\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.539788 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-catalog-content\") pod 
\"9c3cc2f8-718c-4f12-82bb-59857093104c\" (UID: \"9c3cc2f8-718c-4f12-82bb-59857093104c\") " Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.541141 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-utilities" (OuterVolumeSpecName: "utilities") pod "9c3cc2f8-718c-4f12-82bb-59857093104c" (UID: "9c3cc2f8-718c-4f12-82bb-59857093104c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.551634 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3cc2f8-718c-4f12-82bb-59857093104c-kube-api-access-8b2sk" (OuterVolumeSpecName: "kube-api-access-8b2sk") pod "9c3cc2f8-718c-4f12-82bb-59857093104c" (UID: "9c3cc2f8-718c-4f12-82bb-59857093104c"). InnerVolumeSpecName "kube-api-access-8b2sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.641997 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:10:12 crc kubenswrapper[4992]: I0131 10:10:12.642026 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b2sk\" (UniqueName: \"kubernetes.io/projected/9c3cc2f8-718c-4f12-82bb-59857093104c-kube-api-access-8b2sk\") on node \"crc\" DevicePath \"\"" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.015503 4992 generic.go:334] "Generic (PLEG): container finished" podID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerID="fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634" exitCode=0 Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.015558 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" 
event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerDied","Data":"fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634"} Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.015578 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v75bb" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.015590 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75bb" event={"ID":"9c3cc2f8-718c-4f12-82bb-59857093104c","Type":"ContainerDied","Data":"d632a793525ff03f620f904624573475560d4a76f1facc9a47b3235ad4b62d11"} Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.015610 4992 scope.go:117] "RemoveContainer" containerID="fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.041145 4992 scope.go:117] "RemoveContainer" containerID="8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.070701 4992 scope.go:117] "RemoveContainer" containerID="4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.103370 4992 scope.go:117] "RemoveContainer" containerID="fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634" Jan 31 10:10:13 crc kubenswrapper[4992]: E0131 10:10:13.103895 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634\": container with ID starting with fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634 not found: ID does not exist" containerID="fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.103933 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634"} err="failed to get container status \"fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634\": rpc error: code = NotFound desc = could not find container \"fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634\": container with ID starting with fb503841864799671cb0e9429f8de3c4b1755eee3923bc21c9639922a61d3634 not found: ID does not exist" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.103958 4992 scope.go:117] "RemoveContainer" containerID="8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc" Jan 31 10:10:13 crc kubenswrapper[4992]: E0131 10:10:13.104320 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc\": container with ID starting with 8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc not found: ID does not exist" containerID="8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.104349 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc"} err="failed to get container status \"8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc\": rpc error: code = NotFound desc = could not find container \"8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc\": container with ID starting with 8c360d9c0a980408d7e01d39c45ff04485c8a182b75dbec4a21987f16a17cacc not found: ID does not exist" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.104388 4992 scope.go:117] "RemoveContainer" containerID="4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b" Jan 31 10:10:13 crc kubenswrapper[4992]: E0131 10:10:13.104927 4992 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b\": container with ID starting with 4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b not found: ID does not exist" containerID="4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.104953 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b"} err="failed to get container status \"4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b\": rpc error: code = NotFound desc = could not find container \"4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b\": container with ID starting with 4f88a1b24771ee288e28f148d5340654e6e4998f8dc7dffc2b2d6d04b44c875b not found: ID does not exist" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.877512 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c3cc2f8-718c-4f12-82bb-59857093104c" (UID: "9c3cc2f8-718c-4f12-82bb-59857093104c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.885556 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cc2f8-718c-4f12-82bb-59857093104c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.945719 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v75bb"] Jan 31 10:10:13 crc kubenswrapper[4992]: I0131 10:10:13.957894 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v75bb"] Jan 31 10:10:15 crc kubenswrapper[4992]: I0131 10:10:15.197122 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" path="/var/lib/kubelet/pods/9c3cc2f8-718c-4f12-82bb-59857093104c/volumes" Jan 31 10:10:45 crc kubenswrapper[4992]: I0131 10:10:45.300975 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:10:45 crc kubenswrapper[4992]: I0131 10:10:45.301393 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.805918 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g9pd2"] Jan 31 10:11:00 crc kubenswrapper[4992]: E0131 10:11:00.806869 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="extract-utilities" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.806887 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="extract-utilities" Jan 31 10:11:00 crc kubenswrapper[4992]: E0131 10:11:00.806926 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="registry-server" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.806934 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="registry-server" Jan 31 10:11:00 crc kubenswrapper[4992]: E0131 10:11:00.806951 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="extract-content" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.806958 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="extract-content" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.807194 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3cc2f8-718c-4f12-82bb-59857093104c" containerName="registry-server" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.812974 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.819284 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g9pd2"] Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.875150 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzz9z\" (UniqueName: \"kubernetes.io/projected/c225dcf5-b586-430a-9793-26946fe6e312-kube-api-access-xzz9z\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.875621 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-catalog-content\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.875830 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-utilities\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.977867 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-catalog-content\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.978067 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-utilities\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.978135 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzz9z\" (UniqueName: \"kubernetes.io/projected/c225dcf5-b586-430a-9793-26946fe6e312-kube-api-access-xzz9z\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.978518 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-catalog-content\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:00 crc kubenswrapper[4992]: I0131 10:11:00.978707 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-utilities\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:01 crc kubenswrapper[4992]: I0131 10:11:01.005391 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzz9z\" (UniqueName: \"kubernetes.io/projected/c225dcf5-b586-430a-9793-26946fe6e312-kube-api-access-xzz9z\") pod \"community-operators-g9pd2\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:01 crc kubenswrapper[4992]: I0131 10:11:01.143254 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:01 crc kubenswrapper[4992]: I0131 10:11:01.633730 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g9pd2"] Jan 31 10:11:01 crc kubenswrapper[4992]: W0131 10:11:01.640232 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc225dcf5_b586_430a_9793_26946fe6e312.slice/crio-6a4fae8f4935707f9bd0e45f0001fb913e20cd18ff294e851e12bb7e3d091b38 WatchSource:0}: Error finding container 6a4fae8f4935707f9bd0e45f0001fb913e20cd18ff294e851e12bb7e3d091b38: Status 404 returned error can't find the container with id 6a4fae8f4935707f9bd0e45f0001fb913e20cd18ff294e851e12bb7e3d091b38 Jan 31 10:11:02 crc kubenswrapper[4992]: I0131 10:11:02.441093 4992 generic.go:334] "Generic (PLEG): container finished" podID="c225dcf5-b586-430a-9793-26946fe6e312" containerID="8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b" exitCode=0 Jan 31 10:11:02 crc kubenswrapper[4992]: I0131 10:11:02.441145 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerDied","Data":"8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b"} Jan 31 10:11:02 crc kubenswrapper[4992]: I0131 10:11:02.441177 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerStarted","Data":"6a4fae8f4935707f9bd0e45f0001fb913e20cd18ff294e851e12bb7e3d091b38"} Jan 31 10:11:03 crc kubenswrapper[4992]: I0131 10:11:03.449642 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" 
event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerStarted","Data":"6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad"} Jan 31 10:11:04 crc kubenswrapper[4992]: I0131 10:11:04.468350 4992 generic.go:334] "Generic (PLEG): container finished" podID="c225dcf5-b586-430a-9793-26946fe6e312" containerID="6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad" exitCode=0 Jan 31 10:11:04 crc kubenswrapper[4992]: I0131 10:11:04.468407 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerDied","Data":"6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad"} Jan 31 10:11:05 crc kubenswrapper[4992]: I0131 10:11:05.478548 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerStarted","Data":"46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937"} Jan 31 10:11:05 crc kubenswrapper[4992]: I0131 10:11:05.498591 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g9pd2" podStartSLOduration=3.057084977 podStartE2EDuration="5.498569437s" podCreationTimestamp="2026-01-31 10:11:00 +0000 UTC" firstStartedPulling="2026-01-31 10:11:02.443795575 +0000 UTC m=+2758.415187602" lastFinishedPulling="2026-01-31 10:11:04.885280035 +0000 UTC m=+2760.856672062" observedRunningTime="2026-01-31 10:11:05.493151721 +0000 UTC m=+2761.464543738" watchObservedRunningTime="2026-01-31 10:11:05.498569437 +0000 UTC m=+2761.469961434" Jan 31 10:11:11 crc kubenswrapper[4992]: I0131 10:11:11.144106 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:11 crc kubenswrapper[4992]: I0131 10:11:11.144757 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:11 crc kubenswrapper[4992]: I0131 10:11:11.205638 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:11 crc kubenswrapper[4992]: I0131 10:11:11.571599 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:11 crc kubenswrapper[4992]: I0131 10:11:11.618135 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g9pd2"] Jan 31 10:11:13 crc kubenswrapper[4992]: I0131 10:11:13.546200 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g9pd2" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="registry-server" containerID="cri-o://46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937" gracePeriod=2 Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.112760 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.121461 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-catalog-content\") pod \"c225dcf5-b586-430a-9793-26946fe6e312\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.121518 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzz9z\" (UniqueName: \"kubernetes.io/projected/c225dcf5-b586-430a-9793-26946fe6e312-kube-api-access-xzz9z\") pod \"c225dcf5-b586-430a-9793-26946fe6e312\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.121568 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-utilities\") pod \"c225dcf5-b586-430a-9793-26946fe6e312\" (UID: \"c225dcf5-b586-430a-9793-26946fe6e312\") " Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.122535 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-utilities" (OuterVolumeSpecName: "utilities") pod "c225dcf5-b586-430a-9793-26946fe6e312" (UID: "c225dcf5-b586-430a-9793-26946fe6e312"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.128693 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c225dcf5-b586-430a-9793-26946fe6e312-kube-api-access-xzz9z" (OuterVolumeSpecName: "kube-api-access-xzz9z") pod "c225dcf5-b586-430a-9793-26946fe6e312" (UID: "c225dcf5-b586-430a-9793-26946fe6e312"). InnerVolumeSpecName "kube-api-access-xzz9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.224198 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzz9z\" (UniqueName: \"kubernetes.io/projected/c225dcf5-b586-430a-9793-26946fe6e312-kube-api-access-xzz9z\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.224228 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.563985 4992 generic.go:334] "Generic (PLEG): container finished" podID="c225dcf5-b586-430a-9793-26946fe6e312" containerID="46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937" exitCode=0 Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.564040 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerDied","Data":"46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937"} Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.564063 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g9pd2" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.564082 4992 scope.go:117] "RemoveContainer" containerID="46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.564070 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g9pd2" event={"ID":"c225dcf5-b586-430a-9793-26946fe6e312","Type":"ContainerDied","Data":"6a4fae8f4935707f9bd0e45f0001fb913e20cd18ff294e851e12bb7e3d091b38"} Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.582596 4992 scope.go:117] "RemoveContainer" containerID="6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.585015 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c225dcf5-b586-430a-9793-26946fe6e312" (UID: "c225dcf5-b586-430a-9793-26946fe6e312"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.601611 4992 scope.go:117] "RemoveContainer" containerID="8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.629331 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c225dcf5-b586-430a-9793-26946fe6e312-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.651900 4992 scope.go:117] "RemoveContainer" containerID="46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937" Jan 31 10:11:14 crc kubenswrapper[4992]: E0131 10:11:14.652463 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937\": container with ID starting with 46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937 not found: ID does not exist" containerID="46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.652494 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937"} err="failed to get container status \"46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937\": rpc error: code = NotFound desc = could not find container \"46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937\": container with ID starting with 46b2192f8256af9f8eb8e3e5c0cb532f9771c64c9202a55eb949076fea33c937 not found: ID does not exist" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.652522 4992 scope.go:117] "RemoveContainer" containerID="6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad" Jan 31 10:11:14 crc kubenswrapper[4992]: E0131 10:11:14.652877 4992 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad\": container with ID starting with 6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad not found: ID does not exist" containerID="6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.652904 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad"} err="failed to get container status \"6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad\": rpc error: code = NotFound desc = could not find container \"6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad\": container with ID starting with 6be0607112f63ee4c3893517a211c8cb0108e26e5eb856e442678d90bf9495ad not found: ID does not exist" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.652920 4992 scope.go:117] "RemoveContainer" containerID="8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b" Jan 31 10:11:14 crc kubenswrapper[4992]: E0131 10:11:14.653319 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b\": container with ID starting with 8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b not found: ID does not exist" containerID="8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.653360 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b"} err="failed to get container status \"8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b\": rpc error: code = NotFound desc = could 
not find container \"8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b\": container with ID starting with 8b161bf54e3d5858852323dfef5a98ca121072099a861d614c83f0b332939b2b not found: ID does not exist" Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.896496 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g9pd2"] Jan 31 10:11:14 crc kubenswrapper[4992]: I0131 10:11:14.905673 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g9pd2"] Jan 31 10:11:15 crc kubenswrapper[4992]: I0131 10:11:15.199001 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c225dcf5-b586-430a-9793-26946fe6e312" path="/var/lib/kubelet/pods/c225dcf5-b586-430a-9793-26946fe6e312/volumes" Jan 31 10:11:15 crc kubenswrapper[4992]: I0131 10:11:15.301171 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:11:15 crc kubenswrapper[4992]: I0131 10:11:15.301222 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:11:38 crc kubenswrapper[4992]: I0131 10:11:38.770909 4992 generic.go:334] "Generic (PLEG): container finished" podID="a64d87fc-267b-4505-a807-aa020492685c" containerID="2f8e0f335b394311cce7f0072d230e087b923b96275b671368e6527df7e1632d" exitCode=0 Jan 31 10:11:38 crc kubenswrapper[4992]: I0131 10:11:38.771181 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" 
event={"ID":"a64d87fc-267b-4505-a807-aa020492685c","Type":"ContainerDied","Data":"2f8e0f335b394311cce7f0072d230e087b923b96275b671368e6527df7e1632d"} Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.241351 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.435839 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-combined-ca-bundle\") pod \"a64d87fc-267b-4505-a807-aa020492685c\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.435969 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ceph\") pod \"a64d87fc-267b-4505-a807-aa020492685c\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.436045 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ssh-key-openstack-edpm-ipam\") pod \"a64d87fc-267b-4505-a807-aa020492685c\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.436118 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-secret-0\") pod \"a64d87fc-267b-4505-a807-aa020492685c\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.436156 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-inventory\") pod \"a64d87fc-267b-4505-a807-aa020492685c\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.436239 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttfjl\" (UniqueName: \"kubernetes.io/projected/a64d87fc-267b-4505-a807-aa020492685c-kube-api-access-ttfjl\") pod \"a64d87fc-267b-4505-a807-aa020492685c\" (UID: \"a64d87fc-267b-4505-a807-aa020492685c\") " Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.443738 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ceph" (OuterVolumeSpecName: "ceph") pod "a64d87fc-267b-4505-a807-aa020492685c" (UID: "a64d87fc-267b-4505-a807-aa020492685c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.444024 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a64d87fc-267b-4505-a807-aa020492685c" (UID: "a64d87fc-267b-4505-a807-aa020492685c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.444558 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64d87fc-267b-4505-a807-aa020492685c-kube-api-access-ttfjl" (OuterVolumeSpecName: "kube-api-access-ttfjl") pod "a64d87fc-267b-4505-a807-aa020492685c" (UID: "a64d87fc-267b-4505-a807-aa020492685c"). InnerVolumeSpecName "kube-api-access-ttfjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.467157 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-inventory" (OuterVolumeSpecName: "inventory") pod "a64d87fc-267b-4505-a807-aa020492685c" (UID: "a64d87fc-267b-4505-a807-aa020492685c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.469547 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a64d87fc-267b-4505-a807-aa020492685c" (UID: "a64d87fc-267b-4505-a807-aa020492685c"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.472745 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a64d87fc-267b-4505-a807-aa020492685c" (UID: "a64d87fc-267b-4505-a807-aa020492685c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.538301 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.538346 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.538360 4992 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.538373 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.538384 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttfjl\" (UniqueName: \"kubernetes.io/projected/a64d87fc-267b-4505-a807-aa020492685c-kube-api-access-ttfjl\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.538395 4992 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64d87fc-267b-4505-a807-aa020492685c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.792929 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" 
event={"ID":"a64d87fc-267b-4505-a807-aa020492685c","Type":"ContainerDied","Data":"ade4d8ded6a1dc860ccf98b6bb94d9d805d83b3f16eba2c18a2db07997be2bcb"} Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.792971 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade4d8ded6a1dc860ccf98b6bb94d9d805d83b3f16eba2c18a2db07997be2bcb" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.793028 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.945214 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z"] Jan 31 10:11:40 crc kubenswrapper[4992]: E0131 10:11:40.950538 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="registry-server" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.950584 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="registry-server" Jan 31 10:11:40 crc kubenswrapper[4992]: E0131 10:11:40.950645 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64d87fc-267b-4505-a807-aa020492685c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.950659 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64d87fc-267b-4505-a807-aa020492685c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 31 10:11:40 crc kubenswrapper[4992]: E0131 10:11:40.950680 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="extract-utilities" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.950692 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c225dcf5-b586-430a-9793-26946fe6e312" 
containerName="extract-utilities" Jan 31 10:11:40 crc kubenswrapper[4992]: E0131 10:11:40.950717 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="extract-content" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.950728 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="extract-content" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.951020 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64d87fc-267b-4505-a807-aa020492685c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.951068 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="c225dcf5-b586-430a-9793-26946fe6e312" containerName="registry-server" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.952044 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.955075 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.955258 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.955641 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.956540 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.956570 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.956737 4992 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-94q4p" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.956589 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.956852 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.957437 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 31 10:11:40 crc kubenswrapper[4992]: I0131 10:11:40.958186 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z"] Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051048 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051141 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051208 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051278 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051575 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051691 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9hg\" (UniqueName: \"kubernetes.io/projected/b8de9d76-0d2b-4006-b646-c6065aafc642-kube-api-access-ln9hg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051893 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051939 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.051964 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.052055 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.052089 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-inventory\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154295 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154415 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154519 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9hg\" (UniqueName: \"kubernetes.io/projected/b8de9d76-0d2b-4006-b646-c6065aafc642-kube-api-access-ln9hg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154574 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") 
" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154610 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154634 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154684 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154714 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154743 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154803 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.154856 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.156592 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.157259 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph-nova-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.162399 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.162883 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.162953 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.164488 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.165196 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.165375 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.171711 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.175923 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9hg\" (UniqueName: \"kubernetes.io/projected/b8de9d76-0d2b-4006-b646-c6065aafc642-kube-api-access-ln9hg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.182964 4992 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.270517 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:11:41 crc kubenswrapper[4992]: I0131 10:11:41.856446 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z"] Jan 31 10:11:42 crc kubenswrapper[4992]: I0131 10:11:42.812895 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" event={"ID":"b8de9d76-0d2b-4006-b646-c6065aafc642","Type":"ContainerStarted","Data":"9387ccbbb678744413d0e9e3945e2d2d013748b3eb4adde9fb63485bf29b842f"} Jan 31 10:11:42 crc kubenswrapper[4992]: I0131 10:11:42.813301 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" event={"ID":"b8de9d76-0d2b-4006-b646-c6065aafc642","Type":"ContainerStarted","Data":"badcff0a05e45dddd6280c9da17d5d5937fa7ea0055634dacebfa57885195b30"} Jan 31 10:11:42 crc kubenswrapper[4992]: I0131 10:11:42.846520 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" podStartSLOduration=2.36715746 podStartE2EDuration="2.846490966s" podCreationTimestamp="2026-01-31 10:11:40 +0000 UTC" firstStartedPulling="2026-01-31 10:11:41.863607366 +0000 UTC m=+2797.834999373" lastFinishedPulling="2026-01-31 10:11:42.342940902 +0000 UTC m=+2798.314332879" observedRunningTime="2026-01-31 10:11:42.835026226 +0000 UTC m=+2798.806418213" 
watchObservedRunningTime="2026-01-31 10:11:42.846490966 +0000 UTC m=+2798.817882963" Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.301223 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.301549 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.301593 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.302247 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcbcc87f018c90f071ceba0faad9cc5c81ec174891a9b84e3c799f853950d984"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.302303 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://dcbcc87f018c90f071ceba0faad9cc5c81ec174891a9b84e3c799f853950d984" gracePeriod=600 Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.848245 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="dcbcc87f018c90f071ceba0faad9cc5c81ec174891a9b84e3c799f853950d984" exitCode=0 Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.848301 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"dcbcc87f018c90f071ceba0faad9cc5c81ec174891a9b84e3c799f853950d984"} Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.848333 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf"} Jan 31 10:11:45 crc kubenswrapper[4992]: I0131 10:11:45.848353 4992 scope.go:117] "RemoveContainer" containerID="1340d63d6735f01388099cb0411f85d0362d92d1df1b097a83a8fe26a2dc7c81" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.467763 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9ndxz"] Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.474714 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.479256 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ndxz"] Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.635917 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-catalog-content\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.636045 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4lp\" (UniqueName: \"kubernetes.io/projected/5d890eca-6653-49b0-ae1e-fe0488bb38ea-kube-api-access-jb4lp\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.636196 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-utilities\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.738558 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4lp\" (UniqueName: \"kubernetes.io/projected/5d890eca-6653-49b0-ae1e-fe0488bb38ea-kube-api-access-jb4lp\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.738633 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-utilities\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.738732 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-catalog-content\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.739150 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-utilities\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.739177 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-catalog-content\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.771347 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4lp\" (UniqueName: \"kubernetes.io/projected/5d890eca-6653-49b0-ae1e-fe0488bb38ea-kube-api-access-jb4lp\") pod \"certified-operators-9ndxz\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:35 crc kubenswrapper[4992]: I0131 10:12:35.828004 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:36 crc kubenswrapper[4992]: I0131 10:12:36.319029 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ndxz"] Jan 31 10:12:37 crc kubenswrapper[4992]: I0131 10:12:37.296165 4992 generic.go:334] "Generic (PLEG): container finished" podID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerID="afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f" exitCode=0 Jan 31 10:12:37 crc kubenswrapper[4992]: I0131 10:12:37.296353 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerDied","Data":"afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f"} Jan 31 10:12:37 crc kubenswrapper[4992]: I0131 10:12:37.296469 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerStarted","Data":"4adf66882f0b3440f57b1af6fa6860fce52e1e9eaf901231dd65fb8fae6b1e7e"} Jan 31 10:12:38 crc kubenswrapper[4992]: I0131 10:12:38.310263 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerStarted","Data":"43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38"} Jan 31 10:12:39 crc kubenswrapper[4992]: I0131 10:12:39.319195 4992 generic.go:334] "Generic (PLEG): container finished" podID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerID="43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38" exitCode=0 Jan 31 10:12:39 crc kubenswrapper[4992]: I0131 10:12:39.319240 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" 
event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerDied","Data":"43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38"} Jan 31 10:12:40 crc kubenswrapper[4992]: I0131 10:12:40.328665 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerStarted","Data":"8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67"} Jan 31 10:12:40 crc kubenswrapper[4992]: I0131 10:12:40.349311 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9ndxz" podStartSLOduration=2.899707775 podStartE2EDuration="5.349292999s" podCreationTimestamp="2026-01-31 10:12:35 +0000 UTC" firstStartedPulling="2026-01-31 10:12:37.300591242 +0000 UTC m=+2853.271983229" lastFinishedPulling="2026-01-31 10:12:39.750176466 +0000 UTC m=+2855.721568453" observedRunningTime="2026-01-31 10:12:40.345578802 +0000 UTC m=+2856.316970799" watchObservedRunningTime="2026-01-31 10:12:40.349292999 +0000 UTC m=+2856.320684986" Jan 31 10:12:45 crc kubenswrapper[4992]: I0131 10:12:45.828728 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:45 crc kubenswrapper[4992]: I0131 10:12:45.831074 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:45 crc kubenswrapper[4992]: I0131 10:12:45.874616 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:46 crc kubenswrapper[4992]: I0131 10:12:46.450464 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:46 crc kubenswrapper[4992]: I0131 10:12:46.507792 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-9ndxz"] Jan 31 10:12:48 crc kubenswrapper[4992]: I0131 10:12:48.404083 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9ndxz" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="registry-server" containerID="cri-o://8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67" gracePeriod=2 Jan 31 10:12:48 crc kubenswrapper[4992]: I0131 10:12:48.899324 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.102493 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-catalog-content\") pod \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.102550 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-utilities\") pod \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.102596 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4lp\" (UniqueName: \"kubernetes.io/projected/5d890eca-6653-49b0-ae1e-fe0488bb38ea-kube-api-access-jb4lp\") pod \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\" (UID: \"5d890eca-6653-49b0-ae1e-fe0488bb38ea\") " Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.103641 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-utilities" (OuterVolumeSpecName: "utilities") pod "5d890eca-6653-49b0-ae1e-fe0488bb38ea" (UID: 
"5d890eca-6653-49b0-ae1e-fe0488bb38ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.110436 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d890eca-6653-49b0-ae1e-fe0488bb38ea-kube-api-access-jb4lp" (OuterVolumeSpecName: "kube-api-access-jb4lp") pod "5d890eca-6653-49b0-ae1e-fe0488bb38ea" (UID: "5d890eca-6653-49b0-ae1e-fe0488bb38ea"). InnerVolumeSpecName "kube-api-access-jb4lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.206122 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.206190 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb4lp\" (UniqueName: \"kubernetes.io/projected/5d890eca-6653-49b0-ae1e-fe0488bb38ea-kube-api-access-jb4lp\") on node \"crc\" DevicePath \"\"" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.281004 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d890eca-6653-49b0-ae1e-fe0488bb38ea" (UID: "5d890eca-6653-49b0-ae1e-fe0488bb38ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.307208 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d890eca-6653-49b0-ae1e-fe0488bb38ea-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.415913 4992 generic.go:334] "Generic (PLEG): container finished" podID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerID="8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67" exitCode=0 Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.415958 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerDied","Data":"8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67"} Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.415992 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ndxz" event={"ID":"5d890eca-6653-49b0-ae1e-fe0488bb38ea","Type":"ContainerDied","Data":"4adf66882f0b3440f57b1af6fa6860fce52e1e9eaf901231dd65fb8fae6b1e7e"} Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.415997 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9ndxz" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.416013 4992 scope.go:117] "RemoveContainer" containerID="8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.463870 4992 scope.go:117] "RemoveContainer" containerID="43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.466219 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ndxz"] Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.479296 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9ndxz"] Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.487986 4992 scope.go:117] "RemoveContainer" containerID="afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.533259 4992 scope.go:117] "RemoveContainer" containerID="8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67" Jan 31 10:12:49 crc kubenswrapper[4992]: E0131 10:12:49.533658 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67\": container with ID starting with 8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67 not found: ID does not exist" containerID="8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.533696 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67"} err="failed to get container status \"8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67\": rpc error: code = NotFound desc = could not find 
container \"8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67\": container with ID starting with 8d2cf2a835bbf3791ee8173e9095691056412aca49c79036fe2804d0d257dd67 not found: ID does not exist" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.533742 4992 scope.go:117] "RemoveContainer" containerID="43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38" Jan 31 10:12:49 crc kubenswrapper[4992]: E0131 10:12:49.533946 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38\": container with ID starting with 43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38 not found: ID does not exist" containerID="43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.533995 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38"} err="failed to get container status \"43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38\": rpc error: code = NotFound desc = could not find container \"43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38\": container with ID starting with 43ca21b46e55ff01601114b06793e9f03b13a8baa6e10ad83100e44b2d412a38 not found: ID does not exist" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.534013 4992 scope.go:117] "RemoveContainer" containerID="afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f" Jan 31 10:12:49 crc kubenswrapper[4992]: E0131 10:12:49.534271 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f\": container with ID starting with afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f not found: ID does 
not exist" containerID="afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f" Jan 31 10:12:49 crc kubenswrapper[4992]: I0131 10:12:49.534297 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f"} err="failed to get container status \"afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f\": rpc error: code = NotFound desc = could not find container \"afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f\": container with ID starting with afe4734ccdc8d917940f67e52436a826f96e78868c3acc9837aabb65e6007c2f not found: ID does not exist" Jan 31 10:12:51 crc kubenswrapper[4992]: I0131 10:12:51.194413 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" path="/var/lib/kubelet/pods/5d890eca-6653-49b0-ae1e-fe0488bb38ea/volumes" Jan 31 10:13:45 crc kubenswrapper[4992]: I0131 10:13:45.301057 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:13:45 crc kubenswrapper[4992]: I0131 10:13:45.301671 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:13:57 crc kubenswrapper[4992]: I0131 10:13:57.045901 4992 generic.go:334] "Generic (PLEG): container finished" podID="b8de9d76-0d2b-4006-b646-c6065aafc642" containerID="9387ccbbb678744413d0e9e3945e2d2d013748b3eb4adde9fb63485bf29b842f" exitCode=0 Jan 31 10:13:57 crc kubenswrapper[4992]: I0131 10:13:57.045995 
4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" event={"ID":"b8de9d76-0d2b-4006-b646-c6065aafc642","Type":"ContainerDied","Data":"9387ccbbb678744413d0e9e3945e2d2d013748b3eb4adde9fb63485bf29b842f"} Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.519227 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.669124 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-extra-config-0\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.669196 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph-nova-0\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.669221 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.669260 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-0\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670122 4992 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-custom-ceph-combined-ca-bundle\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670189 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-1\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670270 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-1\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670290 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-inventory\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670314 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln9hg\" (UniqueName: \"kubernetes.io/projected/b8de9d76-0d2b-4006-b646-c6065aafc642-kube-api-access-ln9hg\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670349 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ssh-key-openstack-edpm-ipam\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.670413 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-0\") pod \"b8de9d76-0d2b-4006-b646-c6065aafc642\" (UID: \"b8de9d76-0d2b-4006-b646-c6065aafc642\") " Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.675268 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.675796 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8de9d76-0d2b-4006-b646-c6065aafc642-kube-api-access-ln9hg" (OuterVolumeSpecName: "kube-api-access-ln9hg") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "kube-api-access-ln9hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.676511 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph" (OuterVolumeSpecName: "ceph") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.694780 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.696781 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.698872 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.701313 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.705368 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.705781 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.710052 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.720385 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-inventory" (OuterVolumeSpecName: "inventory") pod "b8de9d76-0d2b-4006-b646-c6065aafc642" (UID: "b8de9d76-0d2b-4006-b646-c6065aafc642"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.772915 4992 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773079 4992 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773185 4992 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773284 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773373 4992 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773479 4992 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773593 4992 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-migration-ssh-key-1\") on node \"crc\" DevicePath 
\"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773689 4992 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773785 4992 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-inventory\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773884 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln9hg\" (UniqueName: \"kubernetes.io/projected/b8de9d76-0d2b-4006-b646-c6065aafc642-kube-api-access-ln9hg\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:58 crc kubenswrapper[4992]: I0131 10:13:58.773988 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b8de9d76-0d2b-4006-b646-c6065aafc642-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:13:59 crc kubenswrapper[4992]: I0131 10:13:59.071977 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" event={"ID":"b8de9d76-0d2b-4006-b646-c6065aafc642","Type":"ContainerDied","Data":"badcff0a05e45dddd6280c9da17d5d5937fa7ea0055634dacebfa57885195b30"} Jan 31 10:13:59 crc kubenswrapper[4992]: I0131 10:13:59.072019 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="badcff0a05e45dddd6280c9da17d5d5937fa7ea0055634dacebfa57885195b30" Jan 31 10:13:59 crc kubenswrapper[4992]: I0131 10:13:59.072389 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.432739 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 31 10:14:12 crc kubenswrapper[4992]: E0131 10:14:12.433704 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="registry-server" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.433722 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="registry-server" Jan 31 10:14:12 crc kubenswrapper[4992]: E0131 10:14:12.433743 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="extract-content" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.433753 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="extract-content" Jan 31 10:14:12 crc kubenswrapper[4992]: E0131 10:14:12.433765 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="extract-utilities" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.433772 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="extract-utilities" Jan 31 10:14:12 crc kubenswrapper[4992]: E0131 10:14:12.433796 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8de9d76-0d2b-4006-b646-c6065aafc642" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.433806 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8de9d76-0d2b-4006-b646-c6065aafc642" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.434025 4992 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5d890eca-6653-49b0-ae1e-fe0488bb38ea" containerName="registry-server" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.434047 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8de9d76-0d2b-4006-b646-c6065aafc642" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.435178 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.436947 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.438289 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.449489 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.487968 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488210 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488316 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488340 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488384 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488492 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488545 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488705 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-run\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488768 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.488811 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpkdl\" (UniqueName: \"kubernetes.io/projected/4a08173f-07a3-4a7a-b124-b3a98c1d0749-kube-api-access-kpkdl\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.491537 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.491642 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.491670 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.491692 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.491753 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.491801 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a08173f-07a3-4a7a-b124-b3a98c1d0749-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.499099 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.502162 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.508189 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.519140 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593064 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593440 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593546 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593651 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593741 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593831 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593659 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.593897 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-run\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594009 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594070 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-lib-modules\") pod 
\"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594103 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-dev\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594119 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594146 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594168 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594174 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc 
kubenswrapper[4992]: I0131 10:14:12.594221 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-scripts\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594239 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-config-data\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594328 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-run\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594380 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6cq5\" (UniqueName: \"kubernetes.io/projected/73e60923-2cfb-4f00-adf0-ace27b9623f0-kube-api-access-m6cq5\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594434 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594436 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-run\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594478 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594514 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpkdl\" (UniqueName: \"kubernetes.io/projected/4a08173f-07a3-4a7a-b124-b3a98c1d0749-kube-api-access-kpkdl\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594589 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-dev\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594618 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594646 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-locks-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594687 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-sys\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594719 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594742 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594766 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594827 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc 
kubenswrapper[4992]: I0131 10:14:12.594860 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a08173f-07a3-4a7a-b124-b3a98c1d0749-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594903 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594957 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-lib-modules\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.594992 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.595008 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.595021 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.595052 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-sys\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.595059 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/73e60923-2cfb-4f00-adf0-ace27b9623f0-ceph\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.597499 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.597783 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.597923 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a08173f-07a3-4a7a-b124-b3a98c1d0749-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " 
pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.600114 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.601823 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a08173f-07a3-4a7a-b124-b3a98c1d0749-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.602152 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.612557 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.612843 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08173f-07a3-4a7a-b124-b3a98c1d0749-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.625390 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kpkdl\" (UniqueName: \"kubernetes.io/projected/4a08173f-07a3-4a7a-b124-b3a98c1d0749-kube-api-access-kpkdl\") pod \"cinder-volume-volume1-0\" (UID: \"4a08173f-07a3-4a7a-b124-b3a98c1d0749\") " pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696752 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696830 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-scripts\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696846 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-config-data\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696896 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6cq5\" (UniqueName: \"kubernetes.io/projected/73e60923-2cfb-4f00-adf0-ace27b9623f0-kube-api-access-m6cq5\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696923 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: 
\"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696954 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-dev\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.696970 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697001 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-sys\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697046 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-lib-modules\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697070 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697103 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/73e60923-2cfb-4f00-adf0-ace27b9623f0-ceph\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697133 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697166 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697182 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697200 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697215 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-run\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc 
kubenswrapper[4992]: I0131 10:14:12.697295 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-run\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697316 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-sys\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697331 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-lib-modules\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697407 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697764 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697812 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697850 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.697846 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.698187 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-dev\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.698231 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73e60923-2cfb-4f00-adf0-ace27b9623f0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.700332 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-scripts\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.700981 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.701478 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-config-data\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.702043 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/73e60923-2cfb-4f00-adf0-ace27b9623f0-ceph\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.702855 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e60923-2cfb-4f00-adf0-ace27b9623f0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.717557 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6cq5\" (UniqueName: \"kubernetes.io/projected/73e60923-2cfb-4f00-adf0-ace27b9623f0-kube-api-access-m6cq5\") pod \"cinder-backup-0\" (UID: \"73e60923-2cfb-4f00-adf0-ace27b9623f0\") " pod="openstack/cinder-backup-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.792481 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:12 crc kubenswrapper[4992]: I0131 10:14:12.833057 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.015079 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wd7m9"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.016735 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.023609 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wd7m9"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.106716 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-eb4a-account-create-update-7lc59"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.107294 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tknh2\" (UniqueName: \"kubernetes.io/projected/34b795bf-45ca-4c3c-84a7-39a764219cc2-kube-api-access-tknh2\") pod \"manila-db-create-wd7m9\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.107572 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b795bf-45ca-4c3c-84a7-39a764219cc2-operator-scripts\") pod \"manila-db-create-wd7m9\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.107932 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.110780 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.127515 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-eb4a-account-create-update-7lc59"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.209129 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29kg\" (UniqueName: \"kubernetes.io/projected/94de8369-bb29-499c-b221-bb53527a84e2-kube-api-access-j29kg\") pod \"manila-eb4a-account-create-update-7lc59\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.209198 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b795bf-45ca-4c3c-84a7-39a764219cc2-operator-scripts\") pod \"manila-db-create-wd7m9\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.209292 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tknh2\" (UniqueName: \"kubernetes.io/projected/34b795bf-45ca-4c3c-84a7-39a764219cc2-kube-api-access-tknh2\") pod \"manila-db-create-wd7m9\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.209326 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94de8369-bb29-499c-b221-bb53527a84e2-operator-scripts\") pod \"manila-eb4a-account-create-update-7lc59\" (UID: 
\"94de8369-bb29-499c-b221-bb53527a84e2\") " pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.210273 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b795bf-45ca-4c3c-84a7-39a764219cc2-operator-scripts\") pod \"manila-db-create-wd7m9\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.225634 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tknh2\" (UniqueName: \"kubernetes.io/projected/34b795bf-45ca-4c3c-84a7-39a764219cc2-kube-api-access-tknh2\") pod \"manila-db-create-wd7m9\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.287347 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.289951 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.293192 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-g4lhv" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.293714 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.293764 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.294884 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.300976 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.312135 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94de8369-bb29-499c-b221-bb53527a84e2-operator-scripts\") pod \"manila-eb4a-account-create-update-7lc59\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.312240 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29kg\" (UniqueName: \"kubernetes.io/projected/94de8369-bb29-499c-b221-bb53527a84e2-kube-api-access-j29kg\") pod \"manila-eb4a-account-create-update-7lc59\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.313669 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/94de8369-bb29-499c-b221-bb53527a84e2-operator-scripts\") pod \"manila-eb4a-account-create-update-7lc59\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.333626 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29kg\" (UniqueName: \"kubernetes.io/projected/94de8369-bb29-499c-b221-bb53527a84e2-kube-api-access-j29kg\") pod \"manila-eb4a-account-create-update-7lc59\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.337164 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.343984 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.347024 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.349247 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.354945 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.366008 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.413855 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414001 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414077 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqsk9\" (UniqueName: \"kubernetes.io/projected/a3c52617-d568-4cd4-8cf1-e5b02737770f-kube-api-access-xqsk9\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414112 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-scripts\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " 
pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414153 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414177 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a3c52617-d568-4cd4-8cf1-e5b02737770f-ceph\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414208 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c52617-d568-4cd4-8cf1-e5b02737770f-logs\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414245 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c52617-d568-4cd4-8cf1-e5b02737770f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.414273 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-config-data\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " 
pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.430860 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.439929 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.516385 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c52617-d568-4cd4-8cf1-e5b02737770f-logs\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.516449 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/682536c3-9edb-474b-9854-de0383d1c7f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.516479 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c52617-d568-4cd4-8cf1-e5b02737770f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.516499 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: 
I0131 10:14:13.516527 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-config-data\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.516554 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.516569 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/682536c3-9edb-474b-9854-de0383d1c7f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.517682 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.517760 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 
10:14:13.517799 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.517885 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/682536c3-9edb-474b-9854-de0383d1c7f6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.517006 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c52617-d568-4cd4-8cf1-e5b02737770f-logs\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.517932 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.520031 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.520068 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-v8rjq\" (UniqueName: \"kubernetes.io/projected/682536c3-9edb-474b-9854-de0383d1c7f6-kube-api-access-v8rjq\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.520142 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqsk9\" (UniqueName: \"kubernetes.io/projected/a3c52617-d568-4cd4-8cf1-e5b02737770f-kube-api-access-xqsk9\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.517050 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3c52617-d568-4cd4-8cf1-e5b02737770f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.518232 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.521977 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-scripts\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.522060 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.522094 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a3c52617-d568-4cd4-8cf1-e5b02737770f-ceph\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.525581 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.526582 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-config-data\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.527498 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.528230 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a3c52617-d568-4cd4-8cf1-e5b02737770f-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.531939 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c52617-d568-4cd4-8cf1-e5b02737770f-scripts\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.536714 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqsk9\" (UniqueName: \"kubernetes.io/projected/a3c52617-d568-4cd4-8cf1-e5b02737770f-kube-api-access-xqsk9\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.538975 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.563015 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a3c52617-d568-4cd4-8cf1-e5b02737770f\") " pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.605918 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624060 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/682536c3-9edb-474b-9854-de0383d1c7f6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624310 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624375 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rjq\" (UniqueName: \"kubernetes.io/projected/682536c3-9edb-474b-9854-de0383d1c7f6-kube-api-access-v8rjq\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624480 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/682536c3-9edb-474b-9854-de0383d1c7f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624509 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 
crc kubenswrapper[4992]: I0131 10:14:13.624537 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624552 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/682536c3-9edb-474b-9854-de0383d1c7f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624590 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.624619 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.625568 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/682536c3-9edb-474b-9854-de0383d1c7f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.628004 4992 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/682536c3-9edb-474b-9854-de0383d1c7f6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.628150 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.628508 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/682536c3-9edb-474b-9854-de0383d1c7f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.628693 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.635543 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.643223 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.644771 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682536c3-9edb-474b-9854-de0383d1c7f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.647852 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rjq\" (UniqueName: \"kubernetes.io/projected/682536c3-9edb-474b-9854-de0383d1c7f6-kube-api-access-v8rjq\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.675642 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"682536c3-9edb-474b-9854-de0383d1c7f6\") " pod="openstack/glance-default-internal-api-0" Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.835951 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wd7m9"] Jan 31 10:14:13 crc kubenswrapper[4992]: I0131 10:14:13.976097 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.029504 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-eb4a-account-create-update-7lc59"] Jan 31 10:14:14 crc kubenswrapper[4992]: W0131 10:14:14.042803 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94de8369_bb29_499c_b221_bb53527a84e2.slice/crio-f3b62f477e4cda6ce0274045334005dcf8fcc02eff70e6fbf293217e5a092f37 WatchSource:0}: Error finding container f3b62f477e4cda6ce0274045334005dcf8fcc02eff70e6fbf293217e5a092f37: Status 404 returned error can't find the container with id f3b62f477e4cda6ce0274045334005dcf8fcc02eff70e6fbf293217e5a092f37 Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.202610 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.261980 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4a08173f-07a3-4a7a-b124-b3a98c1d0749","Type":"ContainerStarted","Data":"462be030466213b5d40d8c358b2b9afe383c84407cafa9ebb630d28a35ecf088"} Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.268881 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-eb4a-account-create-update-7lc59" event={"ID":"94de8369-bb29-499c-b221-bb53527a84e2","Type":"ContainerStarted","Data":"f3b62f477e4cda6ce0274045334005dcf8fcc02eff70e6fbf293217e5a092f37"} Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.273314 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wd7m9" event={"ID":"34b795bf-45ca-4c3c-84a7-39a764219cc2","Type":"ContainerStarted","Data":"b5a89fb17016cdaeb2b2de355d0a00a38d2e7b25eff561438ee8017c8a3ac40b"} Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.273357 4992 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/manila-db-create-wd7m9" event={"ID":"34b795bf-45ca-4c3c-84a7-39a764219cc2","Type":"ContainerStarted","Data":"60edf52a06f8985a247261ee835b60d9a3f310b36c62ec6dd27af8908e6e6a2c"} Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.281183 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"73e60923-2cfb-4f00-adf0-ace27b9623f0","Type":"ContainerStarted","Data":"dce25b0ca1045089775607df09de17834d78c0ae30c3e2cf370168350ea6eedc"} Jan 31 10:14:14 crc kubenswrapper[4992]: W0131 10:14:14.334369 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3c52617_d568_4cd4_8cf1_e5b02737770f.slice/crio-e2ebbd0f9f90ff07b5b6804c38630b7a813626d365a8eff44fd00694283919a8 WatchSource:0}: Error finding container e2ebbd0f9f90ff07b5b6804c38630b7a813626d365a8eff44fd00694283919a8: Status 404 returned error can't find the container with id e2ebbd0f9f90ff07b5b6804c38630b7a813626d365a8eff44fd00694283919a8 Jan 31 10:14:14 crc kubenswrapper[4992]: I0131 10:14:14.548951 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.289820 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4a08173f-07a3-4a7a-b124-b3a98c1d0749","Type":"ContainerStarted","Data":"0ff37d2c4e94f91543635121bb556584c653bfac4fbf9889e03c34bd2b5fbf69"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.290390 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"4a08173f-07a3-4a7a-b124-b3a98c1d0749","Type":"ContainerStarted","Data":"3b891466b529c6b4fd9e466317c22f55d83de39760a7d137567747a6954d5a13"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.291575 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"682536c3-9edb-474b-9854-de0383d1c7f6","Type":"ContainerStarted","Data":"de8ef958a4e2f6bae0bebcd1cc8539a5c9ea34e3df26a67d662ccc20c370e6b2"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.291723 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"682536c3-9edb-474b-9854-de0383d1c7f6","Type":"ContainerStarted","Data":"ee8a3398bd321b4a7bebc092da0b94c549e2c58793b7682f606b645fb578c803"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.295882 4992 generic.go:334] "Generic (PLEG): container finished" podID="94de8369-bb29-499c-b221-bb53527a84e2" containerID="c1cfe949603112a8d23a868a9c4a3f01b6c36115c62a39701c09e41de5708d82" exitCode=0 Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.295977 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-eb4a-account-create-update-7lc59" event={"ID":"94de8369-bb29-499c-b221-bb53527a84e2","Type":"ContainerDied","Data":"c1cfe949603112a8d23a868a9c4a3f01b6c36115c62a39701c09e41de5708d82"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.297283 4992 generic.go:334] "Generic (PLEG): container finished" podID="34b795bf-45ca-4c3c-84a7-39a764219cc2" containerID="b5a89fb17016cdaeb2b2de355d0a00a38d2e7b25eff561438ee8017c8a3ac40b" exitCode=0 Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.297330 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wd7m9" event={"ID":"34b795bf-45ca-4c3c-84a7-39a764219cc2","Type":"ContainerDied","Data":"b5a89fb17016cdaeb2b2de355d0a00a38d2e7b25eff561438ee8017c8a3ac40b"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.304612 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:14:15 crc 
kubenswrapper[4992]: I0131 10:14:15.304671 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.306064 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"73e60923-2cfb-4f00-adf0-ace27b9623f0","Type":"ContainerStarted","Data":"2d23ce6899abae7dc349f9e708538de92755930123da3e941929f750b8f93ba3"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.306120 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"73e60923-2cfb-4f00-adf0-ace27b9623f0","Type":"ContainerStarted","Data":"f1ac695aed09232531face0c8de745023c6de899d41806a93fbbbc03f6235605"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.307445 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c52617-d568-4cd4-8cf1-e5b02737770f","Type":"ContainerStarted","Data":"965baae97b14bef6f676417ebe4c4762e327459138d78576b114b3cb6b5b8f17"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.307559 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c52617-d568-4cd4-8cf1-e5b02737770f","Type":"ContainerStarted","Data":"e2ebbd0f9f90ff07b5b6804c38630b7a813626d365a8eff44fd00694283919a8"} Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.316392 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.373309308 podStartE2EDuration="3.316375523s" podCreationTimestamp="2026-01-31 10:14:12 +0000 UTC" firstStartedPulling="2026-01-31 10:14:13.48719337 +0000 UTC m=+2949.458585357" 
lastFinishedPulling="2026-01-31 10:14:14.430259545 +0000 UTC m=+2950.401651572" observedRunningTime="2026-01-31 10:14:15.31071344 +0000 UTC m=+2951.282105447" watchObservedRunningTime="2026-01-31 10:14:15.316375523 +0000 UTC m=+2951.287767510" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.361440 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.450449635 podStartE2EDuration="3.361407487s" podCreationTimestamp="2026-01-31 10:14:12 +0000 UTC" firstStartedPulling="2026-01-31 10:14:13.541302245 +0000 UTC m=+2949.512694232" lastFinishedPulling="2026-01-31 10:14:14.452260097 +0000 UTC m=+2950.423652084" observedRunningTime="2026-01-31 10:14:15.351923854 +0000 UTC m=+2951.323315861" watchObservedRunningTime="2026-01-31 10:14:15.361407487 +0000 UTC m=+2951.332799474" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.735533 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.877911 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknh2\" (UniqueName: \"kubernetes.io/projected/34b795bf-45ca-4c3c-84a7-39a764219cc2-kube-api-access-tknh2\") pod \"34b795bf-45ca-4c3c-84a7-39a764219cc2\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.878021 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b795bf-45ca-4c3c-84a7-39a764219cc2-operator-scripts\") pod \"34b795bf-45ca-4c3c-84a7-39a764219cc2\" (UID: \"34b795bf-45ca-4c3c-84a7-39a764219cc2\") " Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.878832 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34b795bf-45ca-4c3c-84a7-39a764219cc2-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "34b795bf-45ca-4c3c-84a7-39a764219cc2" (UID: "34b795bf-45ca-4c3c-84a7-39a764219cc2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.886184 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b795bf-45ca-4c3c-84a7-39a764219cc2-kube-api-access-tknh2" (OuterVolumeSpecName: "kube-api-access-tknh2") pod "34b795bf-45ca-4c3c-84a7-39a764219cc2" (UID: "34b795bf-45ca-4c3c-84a7-39a764219cc2"). InnerVolumeSpecName "kube-api-access-tknh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.980748 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tknh2\" (UniqueName: \"kubernetes.io/projected/34b795bf-45ca-4c3c-84a7-39a764219cc2-kube-api-access-tknh2\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:15 crc kubenswrapper[4992]: I0131 10:14:15.980775 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34b795bf-45ca-4c3c-84a7-39a764219cc2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.320010 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a3c52617-d568-4cd4-8cf1-e5b02737770f","Type":"ContainerStarted","Data":"19feff427e4340210434eb8b3ea4f8a99573c3f0b3ffcefd5501a5639e3029d2"} Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.321805 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"682536c3-9edb-474b-9854-de0383d1c7f6","Type":"ContainerStarted","Data":"0f4b7b02b95c6f9cba1d8df491b7810b93c2799ad0202ecfb0962ad40e15a7d1"} Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.324128 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-db-create-wd7m9" event={"ID":"34b795bf-45ca-4c3c-84a7-39a764219cc2","Type":"ContainerDied","Data":"60edf52a06f8985a247261ee835b60d9a3f310b36c62ec6dd27af8908e6e6a2c"} Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.324172 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60edf52a06f8985a247261ee835b60d9a3f310b36c62ec6dd27af8908e6e6a2c" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.324182 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wd7m9" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.353556 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.353536453 podStartE2EDuration="4.353536453s" podCreationTimestamp="2026-01-31 10:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:14:16.345491691 +0000 UTC m=+2952.316883688" watchObservedRunningTime="2026-01-31 10:14:16.353536453 +0000 UTC m=+2952.324928440" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.374385 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.374365531 podStartE2EDuration="4.374365531s" podCreationTimestamp="2026-01-31 10:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:14:16.370881051 +0000 UTC m=+2952.342273048" watchObservedRunningTime="2026-01-31 10:14:16.374365531 +0000 UTC m=+2952.345757518" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.740062 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.796190 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j29kg\" (UniqueName: \"kubernetes.io/projected/94de8369-bb29-499c-b221-bb53527a84e2-kube-api-access-j29kg\") pod \"94de8369-bb29-499c-b221-bb53527a84e2\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.796258 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94de8369-bb29-499c-b221-bb53527a84e2-operator-scripts\") pod \"94de8369-bb29-499c-b221-bb53527a84e2\" (UID: \"94de8369-bb29-499c-b221-bb53527a84e2\") " Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.797656 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94de8369-bb29-499c-b221-bb53527a84e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94de8369-bb29-499c-b221-bb53527a84e2" (UID: "94de8369-bb29-499c-b221-bb53527a84e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.819282 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94de8369-bb29-499c-b221-bb53527a84e2-kube-api-access-j29kg" (OuterVolumeSpecName: "kube-api-access-j29kg") pod "94de8369-bb29-499c-b221-bb53527a84e2" (UID: "94de8369-bb29-499c-b221-bb53527a84e2"). InnerVolumeSpecName "kube-api-access-j29kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.898735 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j29kg\" (UniqueName: \"kubernetes.io/projected/94de8369-bb29-499c-b221-bb53527a84e2-kube-api-access-j29kg\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:16 crc kubenswrapper[4992]: I0131 10:14:16.899082 4992 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94de8369-bb29-499c-b221-bb53527a84e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:17 crc kubenswrapper[4992]: I0131 10:14:17.336626 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-eb4a-account-create-update-7lc59" event={"ID":"94de8369-bb29-499c-b221-bb53527a84e2","Type":"ContainerDied","Data":"f3b62f477e4cda6ce0274045334005dcf8fcc02eff70e6fbf293217e5a092f37"} Jan 31 10:14:17 crc kubenswrapper[4992]: I0131 10:14:17.336671 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-eb4a-account-create-update-7lc59" Jan 31 10:14:17 crc kubenswrapper[4992]: I0131 10:14:17.336679 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3b62f477e4cda6ce0274045334005dcf8fcc02eff70e6fbf293217e5a092f37" Jan 31 10:14:17 crc kubenswrapper[4992]: I0131 10:14:17.793370 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:17 crc kubenswrapper[4992]: I0131 10:14:17.834182 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.405843 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-s9dt7"] Jan 31 10:14:18 crc kubenswrapper[4992]: E0131 10:14:18.406626 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94de8369-bb29-499c-b221-bb53527a84e2" containerName="mariadb-account-create-update" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.406650 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="94de8369-bb29-499c-b221-bb53527a84e2" containerName="mariadb-account-create-update" Jan 31 10:14:18 crc kubenswrapper[4992]: E0131 10:14:18.406672 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b795bf-45ca-4c3c-84a7-39a764219cc2" containerName="mariadb-database-create" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.406680 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b795bf-45ca-4c3c-84a7-39a764219cc2" containerName="mariadb-database-create" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.406889 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="94de8369-bb29-499c-b221-bb53527a84e2" containerName="mariadb-account-create-update" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.406920 4992 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="34b795bf-45ca-4c3c-84a7-39a764219cc2" containerName="mariadb-database-create" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.407719 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.410154 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.410349 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-xg5rz" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.427923 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-s9dt7"] Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.531860 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-config-data\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.531928 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-combined-ca-bundle\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.532003 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7m8d\" (UniqueName: \"kubernetes.io/projected/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-kube-api-access-f7m8d\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 
10:14:18.532069 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-job-config-data\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.636659 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-job-config-data\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.636867 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-config-data\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.636922 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-combined-ca-bundle\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.637035 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7m8d\" (UniqueName: \"kubernetes.io/projected/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-kube-api-access-f7m8d\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.641439 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"job-config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-job-config-data\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.641469 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-combined-ca-bundle\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.643104 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-config-data\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.657149 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7m8d\" (UniqueName: \"kubernetes.io/projected/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-kube-api-access-f7m8d\") pod \"manila-db-sync-s9dt7\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:18 crc kubenswrapper[4992]: I0131 10:14:18.740318 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:19 crc kubenswrapper[4992]: I0131 10:14:19.305675 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-s9dt7"] Jan 31 10:14:19 crc kubenswrapper[4992]: W0131 10:14:19.327623 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90bb7b32_ced9_4f29_8649_ceb6f46b89e5.slice/crio-d67d1fcbcfb3898cc5d4a4efd15a3263a849f4302750a65d7456af9362caf3be WatchSource:0}: Error finding container d67d1fcbcfb3898cc5d4a4efd15a3263a849f4302750a65d7456af9362caf3be: Status 404 returned error can't find the container with id d67d1fcbcfb3898cc5d4a4efd15a3263a849f4302750a65d7456af9362caf3be Jan 31 10:14:19 crc kubenswrapper[4992]: I0131 10:14:19.356385 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-s9dt7" event={"ID":"90bb7b32-ced9-4f29-8649-ceb6f46b89e5","Type":"ContainerStarted","Data":"d67d1fcbcfb3898cc5d4a4efd15a3263a849f4302750a65d7456af9362caf3be"} Jan 31 10:14:22 crc kubenswrapper[4992]: I0131 10:14:22.998771 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jan 31 10:14:23 crc kubenswrapper[4992]: I0131 10:14:23.049218 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 31 10:14:23 crc kubenswrapper[4992]: I0131 10:14:23.606853 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 31 10:14:23 crc kubenswrapper[4992]: I0131 10:14:23.606889 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 31 10:14:23 crc kubenswrapper[4992]: I0131 10:14:23.646295 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 31 10:14:23 crc 
kubenswrapper[4992]: I0131 10:14:23.666100 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 31 10:14:23 crc kubenswrapper[4992]: I0131 10:14:23.976590 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:23 crc kubenswrapper[4992]: I0131 10:14:23.976947 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.025566 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.040718 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.402825 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-s9dt7" event={"ID":"90bb7b32-ced9-4f29-8649-ceb6f46b89e5","Type":"ContainerStarted","Data":"2b36f4437cd102392de284aeabdf3bfd255dbbfce36dbded673cc8a3a3e9f180"} Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.403814 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.403846 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.403860 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.403872 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 31 10:14:24 crc kubenswrapper[4992]: I0131 10:14:24.420912 4992 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-s9dt7" podStartSLOduration=1.896480581 podStartE2EDuration="6.4208944s" podCreationTimestamp="2026-01-31 10:14:18 +0000 UTC" firstStartedPulling="2026-01-31 10:14:19.336986571 +0000 UTC m=+2955.308378558" lastFinishedPulling="2026-01-31 10:14:23.86140038 +0000 UTC m=+2959.832792377" observedRunningTime="2026-01-31 10:14:24.418350277 +0000 UTC m=+2960.389742284" watchObservedRunningTime="2026-01-31 10:14:24.4208944 +0000 UTC m=+2960.392286387" Jan 31 10:14:27 crc kubenswrapper[4992]: I0131 10:14:27.016941 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:27 crc kubenswrapper[4992]: I0131 10:14:27.017402 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 31 10:14:27 crc kubenswrapper[4992]: I0131 10:14:27.042996 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 31 10:14:27 crc kubenswrapper[4992]: I0131 10:14:27.043114 4992 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 10:14:27 crc kubenswrapper[4992]: I0131 10:14:27.049675 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 31 10:14:34 crc kubenswrapper[4992]: I0131 10:14:34.499188 4992 generic.go:334] "Generic (PLEG): container finished" podID="90bb7b32-ced9-4f29-8649-ceb6f46b89e5" containerID="2b36f4437cd102392de284aeabdf3bfd255dbbfce36dbded673cc8a3a3e9f180" exitCode=0 Jan 31 10:14:34 crc kubenswrapper[4992]: I0131 10:14:34.499276 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-s9dt7" event={"ID":"90bb7b32-ced9-4f29-8649-ceb6f46b89e5","Type":"ContainerDied","Data":"2b36f4437cd102392de284aeabdf3bfd255dbbfce36dbded673cc8a3a3e9f180"} Jan 31 10:14:35 crc 
kubenswrapper[4992]: I0131 10:14:35.962134 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.107578 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-job-config-data\") pod \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.107718 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7m8d\" (UniqueName: \"kubernetes.io/projected/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-kube-api-access-f7m8d\") pod \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.107773 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-combined-ca-bundle\") pod \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.107863 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-config-data\") pod \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\" (UID: \"90bb7b32-ced9-4f29-8649-ceb6f46b89e5\") " Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.119566 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "90bb7b32-ced9-4f29-8649-ceb6f46b89e5" (UID: "90bb7b32-ced9-4f29-8649-ceb6f46b89e5"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.119584 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-kube-api-access-f7m8d" (OuterVolumeSpecName: "kube-api-access-f7m8d") pod "90bb7b32-ced9-4f29-8649-ceb6f46b89e5" (UID: "90bb7b32-ced9-4f29-8649-ceb6f46b89e5"). InnerVolumeSpecName "kube-api-access-f7m8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.122271 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-config-data" (OuterVolumeSpecName: "config-data") pod "90bb7b32-ced9-4f29-8649-ceb6f46b89e5" (UID: "90bb7b32-ced9-4f29-8649-ceb6f46b89e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.132305 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90bb7b32-ced9-4f29-8649-ceb6f46b89e5" (UID: "90bb7b32-ced9-4f29-8649-ceb6f46b89e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.210475 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7m8d\" (UniqueName: \"kubernetes.io/projected/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-kube-api-access-f7m8d\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.210509 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.210519 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.210527 4992 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/90bb7b32-ced9-4f29-8649-ceb6f46b89e5-job-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.515458 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-s9dt7" event={"ID":"90bb7b32-ced9-4f29-8649-ceb6f46b89e5","Type":"ContainerDied","Data":"d67d1fcbcfb3898cc5d4a4efd15a3263a849f4302750a65d7456af9362caf3be"} Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.515491 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-s9dt7" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.515500 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67d1fcbcfb3898cc5d4a4efd15a3263a849f4302750a65d7456af9362caf3be" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.887882 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:14:36 crc kubenswrapper[4992]: E0131 10:14:36.888520 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90bb7b32-ced9-4f29-8649-ceb6f46b89e5" containerName="manila-db-sync" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.888534 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="90bb7b32-ced9-4f29-8649-ceb6f46b89e5" containerName="manila-db-sync" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.888751 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="90bb7b32-ced9-4f29-8649-ceb6f46b89e5" containerName="manila-db-sync" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.889758 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.893529 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.897156 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.897280 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.897412 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-xg5rz" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.928613 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.931952 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.937148 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.964603 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:14:36 crc kubenswrapper[4992]: I0131 10:14:36.999604 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.041218 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-5tqkp"] Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.043136 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.057821 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.057866 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvc7\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-kube-api-access-tnvc7\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058444 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058482 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t265v\" (UniqueName: \"kubernetes.io/projected/d19358db-cbfe-43cc-8305-b2526c0c8fd2-kube-api-access-t265v\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058515 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data-custom\") pod \"manila-scheduler-0\" (UID: 
\"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058535 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058560 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-ceph\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058580 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058700 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-scripts\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058742 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-scripts\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 
10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058820 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058843 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.058888 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d19358db-cbfe-43cc-8305-b2526c0c8fd2-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.059518 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.059996 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-5tqkp"] Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.128529 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.130286 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.136935 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.144118 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163541 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163608 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-scripts\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163653 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163687 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-logs\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163713 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tnvc7\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-kube-api-access-tnvc7\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163740 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163764 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163784 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwtnm\" (UniqueName: \"kubernetes.io/projected/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-kube-api-access-xwtnm\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163802 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data-custom\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163835 4992 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163850 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t265v\" (UniqueName: \"kubernetes.io/projected/d19358db-cbfe-43cc-8305-b2526c0c8fd2-kube-api-access-t265v\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163876 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163898 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163922 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-ceph\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163941 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-combined-ca-bundle\") pod 
\"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163959 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-etc-machine-id\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163980 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-scripts\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.163997 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164028 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-scripts\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164049 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6dv\" (UniqueName: \"kubernetes.io/projected/e7aee525-4be4-45af-9c7c-543f25591ff9-kube-api-access-qw6dv\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " 
pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164069 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164084 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164107 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-config\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164138 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164155 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d19358db-cbfe-43cc-8305-b2526c0c8fd2-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164179 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.164196 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.165160 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.170525 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d19358db-cbfe-43cc-8305-b2526c0c8fd2-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.171157 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.175258 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-scripts\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.177230 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.177581 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-ceph\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.181031 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-scripts\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.181699 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.185351 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc 
kubenswrapper[4992]: I0131 10:14:37.185876 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.186260 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvc7\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-kube-api-access-tnvc7\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.187007 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.190158 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.193096 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t265v\" (UniqueName: \"kubernetes.io/projected/d19358db-cbfe-43cc-8305-b2526c0c8fd2-kube-api-access-t265v\") pod \"manila-scheduler-0\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.210845 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.259925 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265327 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-logs\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265394 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265441 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265461 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwtnm\" (UniqueName: \"kubernetes.io/projected/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-kube-api-access-xwtnm\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265477 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data-custom\") pod \"manila-api-0\" (UID: 
\"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265554 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-etc-machine-id\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265580 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265615 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw6dv\" (UniqueName: \"kubernetes.io/projected/e7aee525-4be4-45af-9c7c-543f25591ff9-kube-api-access-qw6dv\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265642 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-config\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265670 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265694 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265747 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.265775 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-scripts\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.266464 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-logs\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.267481 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-ovsdbserver-nb\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.268191 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-dns-svc\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.270398 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-config\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.274141 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-ovsdbserver-sb\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.274227 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-etc-machine-id\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.275020 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e7aee525-4be4-45af-9c7c-543f25591ff9-openstack-edpm-ipam\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.277901 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-scripts\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " 
pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.278044 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.288531 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data-custom\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.289345 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.293038 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw6dv\" (UniqueName: \"kubernetes.io/projected/e7aee525-4be4-45af-9c7c-543f25591ff9-kube-api-access-qw6dv\") pod \"dnsmasq-dns-69655fd4bf-5tqkp\" (UID: \"e7aee525-4be4-45af-9c7c-543f25591ff9\") " pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.298106 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwtnm\" (UniqueName: \"kubernetes.io/projected/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-kube-api-access-xwtnm\") pod \"manila-api-0\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.364393 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.454304 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.865518 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:14:37 crc kubenswrapper[4992]: W0131 10:14:37.921194 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b04c6c_582c_481e_899e_c2df2f6229d3.slice/crio-ad1f09728461f60d18a16b1a5b7a98de207afb25554364781598e7b3dfdc279f WatchSource:0}: Error finding container ad1f09728461f60d18a16b1a5b7a98de207afb25554364781598e7b3dfdc279f: Status 404 returned error can't find the container with id ad1f09728461f60d18a16b1a5b7a98de207afb25554364781598e7b3dfdc279f Jan 31 10:14:37 crc kubenswrapper[4992]: I0131 10:14:37.943892 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.071444 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69655fd4bf-5tqkp"] Jan 31 10:14:38 crc kubenswrapper[4992]: W0131 10:14:38.075948 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7aee525_4be4_45af_9c7c_543f25591ff9.slice/crio-54daad5f4783b1c18c93b60875dab38b31815d1d00709c3c56e8762811bd8a42 WatchSource:0}: Error finding container 54daad5f4783b1c18c93b60875dab38b31815d1d00709c3c56e8762811bd8a42: Status 404 returned error can't find the container with id 54daad5f4783b1c18c93b60875dab38b31815d1d00709c3c56e8762811bd8a42 Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.298638 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:38 crc kubenswrapper[4992]: W0131 
10:14:38.299803 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40dbaad7_0ead_44b8_9d0e_242b1aa76f5a.slice/crio-b6daec82c6a2878efb00388724b7e504491729819470ac874b0cfa5435745f39 WatchSource:0}: Error finding container b6daec82c6a2878efb00388724b7e504491729819470ac874b0cfa5435745f39: Status 404 returned error can't find the container with id b6daec82c6a2878efb00388724b7e504491729819470ac874b0cfa5435745f39 Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.556403 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a","Type":"ContainerStarted","Data":"b6daec82c6a2878efb00388724b7e504491729819470ac874b0cfa5435745f39"} Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.558623 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d19358db-cbfe-43cc-8305-b2526c0c8fd2","Type":"ContainerStarted","Data":"14e4dec045029eadda6909b82e5f5e4d4b224ff1e449eb7348a75135d1f1f4fa"} Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.560483 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60b04c6c-582c-481e-899e-c2df2f6229d3","Type":"ContainerStarted","Data":"ad1f09728461f60d18a16b1a5b7a98de207afb25554364781598e7b3dfdc279f"} Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.562376 4992 generic.go:334] "Generic (PLEG): container finished" podID="e7aee525-4be4-45af-9c7c-543f25591ff9" containerID="ad4b18901e78c8f248ea9d6857f5a205d1d4b70276a4e47ad945f75a23699ccc" exitCode=0 Jan 31 10:14:38 crc kubenswrapper[4992]: I0131 10:14:38.562433 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" event={"ID":"e7aee525-4be4-45af-9c7c-543f25591ff9","Type":"ContainerDied","Data":"ad4b18901e78c8f248ea9d6857f5a205d1d4b70276a4e47ad945f75a23699ccc"} Jan 31 10:14:38 crc 
kubenswrapper[4992]: I0131 10:14:38.562462 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" event={"ID":"e7aee525-4be4-45af-9c7c-543f25591ff9","Type":"ContainerStarted","Data":"54daad5f4783b1c18c93b60875dab38b31815d1d00709c3c56e8762811bd8a42"} Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.575100 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" event={"ID":"e7aee525-4be4-45af-9c7c-543f25591ff9","Type":"ContainerStarted","Data":"c95f67359a2566af2818f8a74c6f8c0bc6a220ec8e43317289fb4945f0f2eded"} Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.575389 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.578929 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a","Type":"ContainerStarted","Data":"92242e51a227f416f994fcdf38c1898a5759587a0329f82526c495b8832a1002"} Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.578960 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a","Type":"ContainerStarted","Data":"d2d517af5511a6429bc29ecd1039ada40df8499de694486c71ed75bc67982ef3"} Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.579662 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.581366 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d19358db-cbfe-43cc-8305-b2526c0c8fd2","Type":"ContainerStarted","Data":"60c04f4ae1dadd685606d0389e02c44ce02af1ef8491b940daf265104d504641"} Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.601473 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" podStartSLOduration=3.601454062 podStartE2EDuration="3.601454062s" podCreationTimestamp="2026-01-31 10:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:14:39.597369164 +0000 UTC m=+2975.568761151" watchObservedRunningTime="2026-01-31 10:14:39.601454062 +0000 UTC m=+2975.572846049" Jan 31 10:14:39 crc kubenswrapper[4992]: I0131 10:14:39.622716 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.622696602 podStartE2EDuration="2.622696602s" podCreationTimestamp="2026-01-31 10:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:14:39.621073756 +0000 UTC m=+2975.592465773" watchObservedRunningTime="2026-01-31 10:14:39.622696602 +0000 UTC m=+2975.594088599" Jan 31 10:14:40 crc kubenswrapper[4992]: I0131 10:14:40.018046 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:40 crc kubenswrapper[4992]: I0131 10:14:40.592659 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d19358db-cbfe-43cc-8305-b2526c0c8fd2","Type":"ContainerStarted","Data":"f1911e5661a59cb7708532a13c2463d506ab4101a5e1a464ba6d210ed9ecee59"} Jan 31 10:14:40 crc kubenswrapper[4992]: I0131 10:14:40.613620 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.944940775 podStartE2EDuration="4.613604503s" podCreationTimestamp="2026-01-31 10:14:36 +0000 UTC" firstStartedPulling="2026-01-31 10:14:37.955980339 +0000 UTC m=+2973.927372326" lastFinishedPulling="2026-01-31 10:14:38.624644067 +0000 UTC m=+2974.596036054" observedRunningTime="2026-01-31 10:14:40.610757431 +0000 UTC m=+2976.582149438" 
watchObservedRunningTime="2026-01-31 10:14:40.613604503 +0000 UTC m=+2976.584996490" Jan 31 10:14:41 crc kubenswrapper[4992]: I0131 10:14:41.599130 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api-log" containerID="cri-o://d2d517af5511a6429bc29ecd1039ada40df8499de694486c71ed75bc67982ef3" gracePeriod=30 Jan 31 10:14:41 crc kubenswrapper[4992]: I0131 10:14:41.599199 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api" containerID="cri-o://92242e51a227f416f994fcdf38c1898a5759587a0329f82526c495b8832a1002" gracePeriod=30 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.043620 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.044187 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-central-agent" containerID="cri-o://25f461553e711b8fd0eea5fd5f8dfb1af4c0f6d2cd7def1af3d1291e940fcfdd" gracePeriod=30 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.044283 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-notification-agent" containerID="cri-o://5914eee67ce969f2ce76a0d54e5f962a003738b2e8d0d1b9a580fe74caa5ac52" gracePeriod=30 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.044270 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="sg-core" containerID="cri-o://b3ce0be0f6382442cb59b5c72fb4344a0e99b50335ff474fed060208118d214b" gracePeriod=30 Jan 31 10:14:42 crc 
kubenswrapper[4992]: I0131 10:14:42.044289 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="proxy-httpd" containerID="cri-o://7e8aa85ebc8c634b4478299a2b4601bb2481a63865048dc5b4a72a7cdb93ceab" gracePeriod=30 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.612213 4992 generic.go:334] "Generic (PLEG): container finished" podID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerID="92242e51a227f416f994fcdf38c1898a5759587a0329f82526c495b8832a1002" exitCode=0 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.612245 4992 generic.go:334] "Generic (PLEG): container finished" podID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerID="d2d517af5511a6429bc29ecd1039ada40df8499de694486c71ed75bc67982ef3" exitCode=143 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.612280 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a","Type":"ContainerDied","Data":"92242e51a227f416f994fcdf38c1898a5759587a0329f82526c495b8832a1002"} Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.612305 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a","Type":"ContainerDied","Data":"d2d517af5511a6429bc29ecd1039ada40df8499de694486c71ed75bc67982ef3"} Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.615296 4992 generic.go:334] "Generic (PLEG): container finished" podID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerID="7e8aa85ebc8c634b4478299a2b4601bb2481a63865048dc5b4a72a7cdb93ceab" exitCode=0 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.615315 4992 generic.go:334] "Generic (PLEG): container finished" podID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerID="b3ce0be0f6382442cb59b5c72fb4344a0e99b50335ff474fed060208118d214b" exitCode=2 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 
10:14:42.615322 4992 generic.go:334] "Generic (PLEG): container finished" podID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerID="25f461553e711b8fd0eea5fd5f8dfb1af4c0f6d2cd7def1af3d1291e940fcfdd" exitCode=0 Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.615335 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerDied","Data":"7e8aa85ebc8c634b4478299a2b4601bb2481a63865048dc5b4a72a7cdb93ceab"} Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.615349 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerDied","Data":"b3ce0be0f6382442cb59b5c72fb4344a0e99b50335ff474fed060208118d214b"} Jan 31 10:14:42 crc kubenswrapper[4992]: I0131 10:14:42.615361 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerDied","Data":"25f461553e711b8fd0eea5fd5f8dfb1af4c0f6d2cd7def1af3d1291e940fcfdd"} Jan 31 10:14:43 crc kubenswrapper[4992]: I0131 10:14:43.628678 4992 generic.go:334] "Generic (PLEG): container finished" podID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerID="5914eee67ce969f2ce76a0d54e5f962a003738b2e8d0d1b9a580fe74caa5ac52" exitCode=0 Jan 31 10:14:43 crc kubenswrapper[4992]: I0131 10:14:43.628823 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerDied","Data":"5914eee67ce969f2ce76a0d54e5f962a003738b2e8d0d1b9a580fe74caa5ac52"} Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.211645 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.301083 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.301158 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.301236 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.305253 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.305334 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" gracePeriod=600 Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.386790 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data-custom\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.387230 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-combined-ca-bundle\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.387263 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-scripts\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.387290 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwtnm\" (UniqueName: \"kubernetes.io/projected/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-kube-api-access-xwtnm\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.387319 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-logs\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.387503 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-etc-machine-id\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc 
kubenswrapper[4992]: I0131 10:14:45.387596 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data\") pod \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\" (UID: \"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.394061 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.394144 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.394271 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-logs" (OuterVolumeSpecName: "logs") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.394608 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-scripts" (OuterVolumeSpecName: "scripts") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.397791 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-kube-api-access-xwtnm" (OuterVolumeSpecName: "kube-api-access-xwtnm") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "kube-api-access-xwtnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.424227 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.446305 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data" (OuterVolumeSpecName: "config-data") pod "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" (UID: "40dbaad7-0ead-44b8-9d0e-242b1aa76f5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.446915 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.474626 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489815 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489842 4992 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489851 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489859 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489868 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwtnm\" (UniqueName: \"kubernetes.io/projected/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-kube-api-access-xwtnm\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489877 4992 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-logs\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.489886 4992 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591076 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-combined-ca-bundle\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591124 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-run-httpd\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591223 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-log-httpd\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591274 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-config-data\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591325 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t72vx\" (UniqueName: 
\"kubernetes.io/projected/8eca84ee-5e16-4802-b7d5-88df97d8787b-kube-api-access-t72vx\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591367 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-scripts\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591404 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-ceilometer-tls-certs\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591720 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591734 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.591824 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-sg-core-conf-yaml\") pod \"8eca84ee-5e16-4802-b7d5-88df97d8787b\" (UID: \"8eca84ee-5e16-4802-b7d5-88df97d8787b\") " Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.592375 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.592385 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eca84ee-5e16-4802-b7d5-88df97d8787b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.597602 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-scripts" (OuterVolumeSpecName: "scripts") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.601247 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eca84ee-5e16-4802-b7d5-88df97d8787b-kube-api-access-t72vx" (OuterVolumeSpecName: "kube-api-access-t72vx") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "kube-api-access-t72vx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.628706 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.657026 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60b04c6c-582c-481e-899e-c2df2f6229d3","Type":"ContainerStarted","Data":"4419d810b7105ddcb87a406192e26139c2711846013a97f381dbd08325ee5afa"} Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.666448 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" exitCode=0 Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.666526 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf"} Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.666558 4992 scope.go:117] "RemoveContainer" containerID="dcbcc87f018c90f071ceba0faad9cc5c81ec174891a9b84e3c799f853950d984" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.667173 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.668010 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.669031 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.671835 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"40dbaad7-0ead-44b8-9d0e-242b1aa76f5a","Type":"ContainerDied","Data":"b6daec82c6a2878efb00388724b7e504491729819470ac874b0cfa5435745f39"} Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.671937 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.674762 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eca84ee-5e16-4802-b7d5-88df97d8787b","Type":"ContainerDied","Data":"f592a23950e0a35401e2ad2469d91f5ffa927e5897470ec6221c624f1b70cbf2"} Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.674839 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.704439 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.704771 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t72vx\" (UniqueName: \"kubernetes.io/projected/8eca84ee-5e16-4802-b7d5-88df97d8787b-kube-api-access-t72vx\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.704787 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.704800 4992 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.708649 4992 scope.go:117] "RemoveContainer" containerID="92242e51a227f416f994fcdf38c1898a5759587a0329f82526c495b8832a1002" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.732209 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-config-data" (OuterVolumeSpecName: "config-data") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.734938 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.744603 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eca84ee-5e16-4802-b7d5-88df97d8787b" (UID: "8eca84ee-5e16-4802-b7d5-88df97d8787b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.759444 4992 scope.go:117] "RemoveContainer" containerID="d2d517af5511a6429bc29ecd1039ada40df8499de694486c71ed75bc67982ef3" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.762532 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.788103 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.788711 4992 scope.go:117] "RemoveContainer" containerID="7e8aa85ebc8c634b4478299a2b4601bb2481a63865048dc5b4a72a7cdb93ceab" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.788718 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="sg-core" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.788883 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="sg-core" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.788909 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-central-agent" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.788919 4992 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-central-agent" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.788946 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="proxy-httpd" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.788954 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="proxy-httpd" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.788978 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api-log" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.788987 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api-log" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.789010 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789018 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api" Jan 31 10:14:45 crc kubenswrapper[4992]: E0131 10:14:45.789041 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-notification-agent" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789049 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-notification-agent" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789406 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-notification-agent" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789446 4992 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="sg-core" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789483 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789494 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="proxy-httpd" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789509 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" containerName="ceilometer-central-agent" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.789525 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" containerName="manila-api-log" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.790708 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.794642 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.794722 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.794873 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.813557 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.814759 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eca84ee-5e16-4802-b7d5-88df97d8787b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.823493 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.851882 4992 scope.go:117] "RemoveContainer" containerID="b3ce0be0f6382442cb59b5c72fb4344a0e99b50335ff474fed060208118d214b" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.872628 4992 scope.go:117] "RemoveContainer" containerID="5914eee67ce969f2ce76a0d54e5f962a003738b2e8d0d1b9a580fe74caa5ac52" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.903467 4992 scope.go:117] "RemoveContainer" containerID="25f461553e711b8fd0eea5fd5f8dfb1af4c0f6d2cd7def1af3d1291e940fcfdd" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916431 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916496 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42zqf\" (UniqueName: \"kubernetes.io/projected/89e9cdf6-7f56-4752-a725-f93bb5f98009-kube-api-access-42zqf\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916524 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-config-data\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916560 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-config-data-custom\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916582 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-scripts\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916622 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-internal-tls-certs\") pod \"manila-api-0\" (UID: 
\"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916644 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89e9cdf6-7f56-4752-a725-f93bb5f98009-etc-machine-id\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916662 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e9cdf6-7f56-4752-a725-f93bb5f98009-logs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:45 crc kubenswrapper[4992]: I0131 10:14:45.916694 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-public-tls-certs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018088 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-internal-tls-certs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018144 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89e9cdf6-7f56-4752-a725-f93bb5f98009-etc-machine-id\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018166 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e9cdf6-7f56-4752-a725-f93bb5f98009-logs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018224 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-public-tls-certs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018285 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018335 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zqf\" (UniqueName: \"kubernetes.io/projected/89e9cdf6-7f56-4752-a725-f93bb5f98009-kube-api-access-42zqf\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018378 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-config-data\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018445 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-config-data-custom\") pod \"manila-api-0\" (UID: 
\"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.018470 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-scripts\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.019786 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/89e9cdf6-7f56-4752-a725-f93bb5f98009-etc-machine-id\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.023243 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-config-data\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.026179 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-internal-tls-certs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.026700 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-config-data-custom\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.027217 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.028508 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e9cdf6-7f56-4752-a725-f93bb5f98009-logs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.031493 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-scripts\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.042053 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89e9cdf6-7f56-4752-a725-f93bb5f98009-public-tls-certs\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.053089 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42zqf\" (UniqueName: \"kubernetes.io/projected/89e9cdf6-7f56-4752-a725-f93bb5f98009-kube-api-access-42zqf\") pod \"manila-api-0\" (UID: \"89e9cdf6-7f56-4752-a725-f93bb5f98009\") " pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.118781 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.219563 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.240290 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.260537 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.262817 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.269142 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.272135 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.272372 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.272524 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.443853 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82bsl\" (UniqueName: \"kubernetes.io/projected/9605df0d-7749-48e8-813f-c35000bf34c3-kube-api-access-82bsl\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.443976 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-log-httpd\") pod \"ceilometer-0\" (UID: 
\"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.444033 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-scripts\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.444060 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-config-data\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.444082 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-run-httpd\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.444141 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.444212 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.444248 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545395 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545466 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545511 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82bsl\" (UniqueName: \"kubernetes.io/projected/9605df0d-7749-48e8-813f-c35000bf34c3-kube-api-access-82bsl\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545576 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-log-httpd\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545613 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-scripts\") pod 
\"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545631 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-config-data\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545644 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-run-httpd\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.545681 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.547798 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-log-httpd\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.548159 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-run-httpd\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.552733 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.559065 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.560315 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.561114 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-scripts\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.564347 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-config-data\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.566364 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82bsl\" (UniqueName: \"kubernetes.io/projected/9605df0d-7749-48e8-813f-c35000bf34c3-kube-api-access-82bsl\") pod \"ceilometer-0\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 
10:14:46.598396 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.693149 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60b04c6c-582c-481e-899e-c2df2f6229d3","Type":"ContainerStarted","Data":"02cf441f137983923d74c51f94ba8ae632f7efe1f34c01e6a05cddd2647b4a6f"} Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.727210 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.561895065 podStartE2EDuration="10.727187496s" podCreationTimestamp="2026-01-31 10:14:36 +0000 UTC" firstStartedPulling="2026-01-31 10:14:37.92469392 +0000 UTC m=+2973.896085907" lastFinishedPulling="2026-01-31 10:14:45.089986351 +0000 UTC m=+2981.061378338" observedRunningTime="2026-01-31 10:14:46.718128545 +0000 UTC m=+2982.689520562" watchObservedRunningTime="2026-01-31 10:14:46.727187496 +0000 UTC m=+2982.698579493" Jan 31 10:14:46 crc kubenswrapper[4992]: I0131 10:14:46.772583 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 31 10:14:46 crc kubenswrapper[4992]: W0131 10:14:46.776966 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89e9cdf6_7f56_4752_a725_f93bb5f98009.slice/crio-b6f7d4bbbe963bc5edde81b3658bb3d91abbebb4464486f692ccaa8628fccda7 WatchSource:0}: Error finding container b6f7d4bbbe963bc5edde81b3658bb3d91abbebb4464486f692ccaa8628fccda7: Status 404 returned error can't find the container with id b6f7d4bbbe963bc5edde81b3658bb3d91abbebb4464486f692ccaa8628fccda7 Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.079891 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:47 crc kubenswrapper[4992]: W0131 10:14:47.086617 4992 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9605df0d_7749_48e8_813f_c35000bf34c3.slice/crio-aae5a7db329a342d6bbb948c21b4015b83425d9e5a4c6666fd30d1b448d6687a WatchSource:0}: Error finding container aae5a7db329a342d6bbb948c21b4015b83425d9e5a4c6666fd30d1b448d6687a: Status 404 returned error can't find the container with id aae5a7db329a342d6bbb948c21b4015b83425d9e5a4c6666fd30d1b448d6687a Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.195727 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dbaad7-0ead-44b8-9d0e-242b1aa76f5a" path="/var/lib/kubelet/pods/40dbaad7-0ead-44b8-9d0e-242b1aa76f5a/volumes" Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.197700 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eca84ee-5e16-4802-b7d5-88df97d8787b" path="/var/lib/kubelet/pods/8eca84ee-5e16-4802-b7d5-88df97d8787b/volumes" Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.212237 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.260808 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.366610 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69655fd4bf-5tqkp" Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.459120 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-lqx4l"] Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.459385 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerName="dnsmasq-dns" containerID="cri-o://5193f7dad3d75f00d9be15d7afc9cab70355890a8643bfd2353d291424d99341" gracePeriod=10 Jan 31 10:14:47 crc 
kubenswrapper[4992]: I0131 10:14:47.714238 4992 generic.go:334] "Generic (PLEG): container finished" podID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerID="5193f7dad3d75f00d9be15d7afc9cab70355890a8643bfd2353d291424d99341" exitCode=0 Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.714312 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" event={"ID":"6dfb628f-f3c1-402c-8ff0-3c52f32003e0","Type":"ContainerDied","Data":"5193f7dad3d75f00d9be15d7afc9cab70355890a8643bfd2353d291424d99341"} Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.715557 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"89e9cdf6-7f56-4752-a725-f93bb5f98009","Type":"ContainerStarted","Data":"b6f7d4bbbe963bc5edde81b3658bb3d91abbebb4464486f692ccaa8628fccda7"} Jan 31 10:14:47 crc kubenswrapper[4992]: I0131 10:14:47.716528 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerStarted","Data":"aae5a7db329a342d6bbb948c21b4015b83425d9e5a4c6666fd30d1b448d6687a"} Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.178439 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.292389 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-dns-svc\") pod \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.292483 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-openstack-edpm-ipam\") pod \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.292543 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-config\") pod \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.292609 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-sb\") pod \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.292665 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mtnx\" (UniqueName: \"kubernetes.io/projected/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-kube-api-access-5mtnx\") pod \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.293280 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-nb\") pod \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\" (UID: \"6dfb628f-f3c1-402c-8ff0-3c52f32003e0\") " Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.309761 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-kube-api-access-5mtnx" (OuterVolumeSpecName: "kube-api-access-5mtnx") pod "6dfb628f-f3c1-402c-8ff0-3c52f32003e0" (UID: "6dfb628f-f3c1-402c-8ff0-3c52f32003e0"). InnerVolumeSpecName "kube-api-access-5mtnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.346920 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-config" (OuterVolumeSpecName: "config") pod "6dfb628f-f3c1-402c-8ff0-3c52f32003e0" (UID: "6dfb628f-f3c1-402c-8ff0-3c52f32003e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.353871 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6dfb628f-f3c1-402c-8ff0-3c52f32003e0" (UID: "6dfb628f-f3c1-402c-8ff0-3c52f32003e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.354590 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dfb628f-f3c1-402c-8ff0-3c52f32003e0" (UID: "6dfb628f-f3c1-402c-8ff0-3c52f32003e0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.358983 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6dfb628f-f3c1-402c-8ff0-3c52f32003e0" (UID: "6dfb628f-f3c1-402c-8ff0-3c52f32003e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.360705 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6dfb628f-f3c1-402c-8ff0-3c52f32003e0" (UID: "6dfb628f-f3c1-402c-8ff0-3c52f32003e0"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.396053 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.396100 4992 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.396113 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.396126 4992 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-config\") on node \"crc\" 
DevicePath \"\"" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.396137 4992 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.396147 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mtnx\" (UniqueName: \"kubernetes.io/projected/6dfb628f-f3c1-402c-8ff0-3c52f32003e0-kube-api-access-5mtnx\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.733483 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.733641 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fbc59fbb7-lqx4l" event={"ID":"6dfb628f-f3c1-402c-8ff0-3c52f32003e0","Type":"ContainerDied","Data":"02f753d3e16a57b754cf55b25f92c6c3fb7bfd977ef41ff630c709692cd6770a"} Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.733975 4992 scope.go:117] "RemoveContainer" containerID="5193f7dad3d75f00d9be15d7afc9cab70355890a8643bfd2353d291424d99341" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.746897 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"89e9cdf6-7f56-4752-a725-f93bb5f98009","Type":"ContainerStarted","Data":"1443962c48aa8914c253b7e99720a3a6f2cc3f07e8d790e414529f7870f744c9"} Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.746940 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"89e9cdf6-7f56-4752-a725-f93bb5f98009","Type":"ContainerStarted","Data":"80771c7c2b00837cd8cfa1b89a9b0bb8c9a9ea8bbcbe06961cb24471090024af"} Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.746980 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/manila-api-0" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.754300 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerStarted","Data":"29d6d5dd62c5944acda70c14566f49decadb0fe724c0e76e270ad98444fbb556"} Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.779632 4992 scope.go:117] "RemoveContainer" containerID="c54c49ed0068a6cad171837fb3681f86c0ba9b69fce3443480c042df2571dcda" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.782523 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.782496849 podStartE2EDuration="3.782496849s" podCreationTimestamp="2026-01-31 10:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:14:48.769632859 +0000 UTC m=+2984.741024846" watchObservedRunningTime="2026-01-31 10:14:48.782496849 +0000 UTC m=+2984.753888836" Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.796183 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-lqx4l"] Jan 31 10:14:48 crc kubenswrapper[4992]: I0131 10:14:48.803635 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fbc59fbb7-lqx4l"] Jan 31 10:14:49 crc kubenswrapper[4992]: I0131 10:14:49.193339 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" path="/var/lib/kubelet/pods/6dfb628f-f3c1-402c-8ff0-3c52f32003e0/volumes" Jan 31 10:14:49 crc kubenswrapper[4992]: I0131 10:14:49.766828 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerStarted","Data":"7d23482278e5cbb4aec5499f42886e0ca3a1a82535a96771dc31e115d5a2c2c1"} Jan 31 10:14:50 crc kubenswrapper[4992]: I0131 10:14:50.750387 
4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:50 crc kubenswrapper[4992]: I0131 10:14:50.778734 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerStarted","Data":"2a7819ba21b2617b0915736678fcc772f581ba2a5986561beabae11703dbe02f"} Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.797110 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerStarted","Data":"9bafac3ecd74fc679ef64749a41cc05a9c90e22bd13193f32b5d0569ca53b268"} Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.797581 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-notification-agent" containerID="cri-o://7d23482278e5cbb4aec5499f42886e0ca3a1a82535a96771dc31e115d5a2c2c1" gracePeriod=30 Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.797597 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.797571 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="sg-core" containerID="cri-o://2a7819ba21b2617b0915736678fcc772f581ba2a5986561beabae11703dbe02f" gracePeriod=30 Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.797557 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="proxy-httpd" containerID="cri-o://9bafac3ecd74fc679ef64749a41cc05a9c90e22bd13193f32b5d0569ca53b268" gracePeriod=30 Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.797257 4992 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-central-agent" containerID="cri-o://29d6d5dd62c5944acda70c14566f49decadb0fe724c0e76e270ad98444fbb556" gracePeriod=30 Jan 31 10:14:52 crc kubenswrapper[4992]: I0131 10:14:52.834009 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.131834528 podStartE2EDuration="6.833984245s" podCreationTimestamp="2026-01-31 10:14:46 +0000 UTC" firstStartedPulling="2026-01-31 10:14:47.089408037 +0000 UTC m=+2983.060800024" lastFinishedPulling="2026-01-31 10:14:51.791557764 +0000 UTC m=+2987.762949741" observedRunningTime="2026-01-31 10:14:52.819067936 +0000 UTC m=+2988.790459943" watchObservedRunningTime="2026-01-31 10:14:52.833984245 +0000 UTC m=+2988.805376272" Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818597 4992 generic.go:334] "Generic (PLEG): container finished" podID="9605df0d-7749-48e8-813f-c35000bf34c3" containerID="9bafac3ecd74fc679ef64749a41cc05a9c90e22bd13193f32b5d0569ca53b268" exitCode=0 Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818873 4992 generic.go:334] "Generic (PLEG): container finished" podID="9605df0d-7749-48e8-813f-c35000bf34c3" containerID="2a7819ba21b2617b0915736678fcc772f581ba2a5986561beabae11703dbe02f" exitCode=2 Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818882 4992 generic.go:334] "Generic (PLEG): container finished" podID="9605df0d-7749-48e8-813f-c35000bf34c3" containerID="7d23482278e5cbb4aec5499f42886e0ca3a1a82535a96771dc31e115d5a2c2c1" exitCode=0 Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818890 4992 generic.go:334] "Generic (PLEG): container finished" podID="9605df0d-7749-48e8-813f-c35000bf34c3" containerID="29d6d5dd62c5944acda70c14566f49decadb0fe724c0e76e270ad98444fbb556" exitCode=0 Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818789 4992 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerDied","Data":"9bafac3ecd74fc679ef64749a41cc05a9c90e22bd13193f32b5d0569ca53b268"} Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818927 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerDied","Data":"2a7819ba21b2617b0915736678fcc772f581ba2a5986561beabae11703dbe02f"} Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818942 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerDied","Data":"7d23482278e5cbb4aec5499f42886e0ca3a1a82535a96771dc31e115d5a2c2c1"} Jan 31 10:14:53 crc kubenswrapper[4992]: I0131 10:14:53.818954 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerDied","Data":"29d6d5dd62c5944acda70c14566f49decadb0fe724c0e76e270ad98444fbb556"} Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.101888 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209500 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-ceilometer-tls-certs\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209555 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-combined-ca-bundle\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209677 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-config-data\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209740 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-sg-core-conf-yaml\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209803 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-log-httpd\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209851 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-run-httpd\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209920 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-scripts\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.209970 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82bsl\" (UniqueName: \"kubernetes.io/projected/9605df0d-7749-48e8-813f-c35000bf34c3-kube-api-access-82bsl\") pod \"9605df0d-7749-48e8-813f-c35000bf34c3\" (UID: \"9605df0d-7749-48e8-813f-c35000bf34c3\") " Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.210483 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.210654 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.211341 4992 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.211367 4992 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605df0d-7749-48e8-813f-c35000bf34c3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.215007 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9605df0d-7749-48e8-813f-c35000bf34c3-kube-api-access-82bsl" (OuterVolumeSpecName: "kube-api-access-82bsl") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "kube-api-access-82bsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.215843 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-scripts" (OuterVolumeSpecName: "scripts") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.245101 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.265605 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.307038 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-config-data" (OuterVolumeSpecName: "config-data") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.310176 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9605df0d-7749-48e8-813f-c35000bf34c3" (UID: "9605df0d-7749-48e8-813f-c35000bf34c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.313590 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.313623 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.313632 4992 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.313641 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.313649 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82bsl\" (UniqueName: \"kubernetes.io/projected/9605df0d-7749-48e8-813f-c35000bf34c3-kube-api-access-82bsl\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.313661 4992 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9605df0d-7749-48e8-813f-c35000bf34c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.832781 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605df0d-7749-48e8-813f-c35000bf34c3","Type":"ContainerDied","Data":"aae5a7db329a342d6bbb948c21b4015b83425d9e5a4c6666fd30d1b448d6687a"} Jan 31 10:14:54 crc 
kubenswrapper[4992]: I0131 10:14:54.832890 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.833314 4992 scope.go:117] "RemoveContainer" containerID="9bafac3ecd74fc679ef64749a41cc05a9c90e22bd13193f32b5d0569ca53b268" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.878330 4992 scope.go:117] "RemoveContainer" containerID="2a7819ba21b2617b0915736678fcc772f581ba2a5986561beabae11703dbe02f" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.883949 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.893516 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.907657 4992 scope.go:117] "RemoveContainer" containerID="7d23482278e5cbb4aec5499f42886e0ca3a1a82535a96771dc31e115d5a2c2c1" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.913573 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:54 crc kubenswrapper[4992]: E0131 10:14:54.914046 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-central-agent" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914074 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-central-agent" Jan 31 10:14:54 crc kubenswrapper[4992]: E0131 10:14:54.914100 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerName="init" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914108 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerName="init" Jan 31 10:14:54 crc kubenswrapper[4992]: E0131 10:14:54.914118 4992 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="proxy-httpd" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914128 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="proxy-httpd" Jan 31 10:14:54 crc kubenswrapper[4992]: E0131 10:14:54.914146 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="sg-core" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914157 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="sg-core" Jan 31 10:14:54 crc kubenswrapper[4992]: E0131 10:14:54.914179 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-notification-agent" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914187 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-notification-agent" Jan 31 10:14:54 crc kubenswrapper[4992]: E0131 10:14:54.914214 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerName="dnsmasq-dns" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914222 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerName="dnsmasq-dns" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914441 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-central-agent" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914456 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="sg-core" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914474 4992 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="ceilometer-notification-agent" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914487 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfb628f-f3c1-402c-8ff0-3c52f32003e0" containerName="dnsmasq-dns" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.914497 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" containerName="proxy-httpd" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.916387 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.926643 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.927129 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.925556 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.946100 4992 scope.go:117] "RemoveContainer" containerID="29d6d5dd62c5944acda70c14566f49decadb0fe724c0e76e270ad98444fbb556" Jan 31 10:14:54 crc kubenswrapper[4992]: I0131 10:14:54.962763 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.027952 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-config-data\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028044 4992 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028111 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028167 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-scripts\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028281 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028365 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cad81f-c9ec-406f-81d1-749646e7e81b-log-httpd\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028460 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77zf5\" 
(UniqueName: \"kubernetes.io/projected/d2cad81f-c9ec-406f-81d1-749646e7e81b-kube-api-access-77zf5\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.028567 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cad81f-c9ec-406f-81d1-749646e7e81b-run-httpd\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130287 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-config-data\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130349 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130381 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130411 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-scripts\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc 
kubenswrapper[4992]: I0131 10:14:55.130499 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130543 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cad81f-c9ec-406f-81d1-749646e7e81b-log-httpd\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130575 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77zf5\" (UniqueName: \"kubernetes.io/projected/d2cad81f-c9ec-406f-81d1-749646e7e81b-kube-api-access-77zf5\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.130625 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cad81f-c9ec-406f-81d1-749646e7e81b-run-httpd\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.131294 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cad81f-c9ec-406f-81d1-749646e7e81b-log-httpd\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.132018 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2cad81f-c9ec-406f-81d1-749646e7e81b-run-httpd\") pod 
\"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.136383 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.136887 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-config-data\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.137892 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.137942 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.138942 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cad81f-c9ec-406f-81d1-749646e7e81b-scripts\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.166506 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77zf5\" 
(UniqueName: \"kubernetes.io/projected/d2cad81f-c9ec-406f-81d1-749646e7e81b-kube-api-access-77zf5\") pod \"ceilometer-0\" (UID: \"d2cad81f-c9ec-406f-81d1-749646e7e81b\") " pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.217408 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9605df0d-7749-48e8-813f-c35000bf34c3" path="/var/lib/kubelet/pods/9605df0d-7749-48e8-813f-c35000bf34c3/volumes" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.241795 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.684877 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 31 10:14:55 crc kubenswrapper[4992]: I0131 10:14:55.847643 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cad81f-c9ec-406f-81d1-749646e7e81b","Type":"ContainerStarted","Data":"93c82700bd82d716dada1860e2bd8988c653b3eb78b3eaf4b53882e16ffc3fa5"} Jan 31 10:14:56 crc kubenswrapper[4992]: I0131 10:14:56.856150 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cad81f-c9ec-406f-81d1-749646e7e81b","Type":"ContainerStarted","Data":"e471ed300f6e4c86077fe07d1bbd0505decb03f0c2260c6d36e3f6aa0052957d"} Jan 31 10:14:57 crc kubenswrapper[4992]: I0131 10:14:57.182385 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:14:57 crc kubenswrapper[4992]: E0131 10:14:57.182835 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:14:57 crc kubenswrapper[4992]: I0131 10:14:57.866141 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cad81f-c9ec-406f-81d1-749646e7e81b","Type":"ContainerStarted","Data":"3e09e367a8828168b497a5d43eb0e6e328f84b0b8f2f0d267f4e8389367a3938"} Jan 31 10:14:57 crc kubenswrapper[4992]: I0131 10:14:57.866681 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cad81f-c9ec-406f-81d1-749646e7e81b","Type":"ContainerStarted","Data":"e8bb64f2146712bec92837d3b413b6e7ee85d32574173c7c4efab7ea78aa581c"} Jan 31 10:14:58 crc kubenswrapper[4992]: I0131 10:14:58.923397 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 31 10:14:58 crc kubenswrapper[4992]: I0131 10:14:58.940845 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 31 10:14:58 crc kubenswrapper[4992]: I0131 10:14:58.999495 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:14:59 crc kubenswrapper[4992]: I0131 10:14:59.031474 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:14:59 crc kubenswrapper[4992]: I0131 10:14:59.883782 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2cad81f-c9ec-406f-81d1-749646e7e81b","Type":"ContainerStarted","Data":"42f69991ece297615d79989758e5f4738eed9ca770c536ec358815a3e29b1552"} Jan 31 10:14:59 crc kubenswrapper[4992]: I0131 10:14:59.883925 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="manila-scheduler" containerID="cri-o://60c04f4ae1dadd685606d0389e02c44ce02af1ef8491b940daf265104d504641" gracePeriod=30 Jan 31 10:14:59 crc kubenswrapper[4992]: 
I0131 10:14:59.883984 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="probe" containerID="cri-o://f1911e5661a59cb7708532a13c2463d506ab4101a5e1a464ba6d210ed9ecee59" gracePeriod=30 Jan 31 10:14:59 crc kubenswrapper[4992]: I0131 10:14:59.884403 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="probe" containerID="cri-o://02cf441f137983923d74c51f94ba8ae632f7efe1f34c01e6a05cddd2647b4a6f" gracePeriod=30 Jan 31 10:14:59 crc kubenswrapper[4992]: I0131 10:14:59.884378 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="manila-share" containerID="cri-o://4419d810b7105ddcb87a406192e26139c2711846013a97f381dbd08325ee5afa" gracePeriod=30 Jan 31 10:14:59 crc kubenswrapper[4992]: I0131 10:14:59.921273 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.04036258 podStartE2EDuration="5.921252583s" podCreationTimestamp="2026-01-31 10:14:54 +0000 UTC" firstStartedPulling="2026-01-31 10:14:55.697456555 +0000 UTC m=+2991.668848552" lastFinishedPulling="2026-01-31 10:14:59.578346578 +0000 UTC m=+2995.549738555" observedRunningTime="2026-01-31 10:14:59.910801603 +0000 UTC m=+2995.882193600" watchObservedRunningTime="2026-01-31 10:14:59.921252583 +0000 UTC m=+2995.892644570" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.197839 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225"] Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.199293 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.205608 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.205651 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.216962 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225"] Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.348581 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c67bn\" (UniqueName: \"kubernetes.io/projected/d357b512-28b5-40f4-9839-7224cc8db4d7-kube-api-access-c67bn\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.348633 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d357b512-28b5-40f4-9839-7224cc8db4d7-config-volume\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.348687 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d357b512-28b5-40f4-9839-7224cc8db4d7-secret-volume\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.450897 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c67bn\" (UniqueName: \"kubernetes.io/projected/d357b512-28b5-40f4-9839-7224cc8db4d7-kube-api-access-c67bn\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.451031 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d357b512-28b5-40f4-9839-7224cc8db4d7-config-volume\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.451094 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d357b512-28b5-40f4-9839-7224cc8db4d7-secret-volume\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.452152 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d357b512-28b5-40f4-9839-7224cc8db4d7-config-volume\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.457228 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d357b512-28b5-40f4-9839-7224cc8db4d7-secret-volume\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.470900 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c67bn\" (UniqueName: \"kubernetes.io/projected/d357b512-28b5-40f4-9839-7224cc8db4d7-kube-api-access-c67bn\") pod \"collect-profiles-29497575-hb225\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.523480 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.900240 4992 generic.go:334] "Generic (PLEG): container finished" podID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerID="f1911e5661a59cb7708532a13c2463d506ab4101a5e1a464ba6d210ed9ecee59" exitCode=0 Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.900314 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d19358db-cbfe-43cc-8305-b2526c0c8fd2","Type":"ContainerDied","Data":"f1911e5661a59cb7708532a13c2463d506ab4101a5e1a464ba6d210ed9ecee59"} Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.902607 4992 generic.go:334] "Generic (PLEG): container finished" podID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerID="02cf441f137983923d74c51f94ba8ae632f7efe1f34c01e6a05cddd2647b4a6f" exitCode=0 Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.902632 4992 generic.go:334] "Generic (PLEG): container finished" podID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerID="4419d810b7105ddcb87a406192e26139c2711846013a97f381dbd08325ee5afa" exitCode=1 Jan 31 10:15:00 crc kubenswrapper[4992]: 
I0131 10:15:00.902680 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60b04c6c-582c-481e-899e-c2df2f6229d3","Type":"ContainerDied","Data":"02cf441f137983923d74c51f94ba8ae632f7efe1f34c01e6a05cddd2647b4a6f"} Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.902719 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60b04c6c-582c-481e-899e-c2df2f6229d3","Type":"ContainerDied","Data":"4419d810b7105ddcb87a406192e26139c2711846013a97f381dbd08325ee5afa"} Jan 31 10:15:00 crc kubenswrapper[4992]: I0131 10:15:00.903009 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.011940 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225"] Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.239992 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.366969 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-combined-ca-bundle\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367281 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-scripts\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367361 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvc7\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-kube-api-access-tnvc7\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367506 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data-custom\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367549 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-ceph\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367614 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367721 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-etc-machine-id\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.367785 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-var-lib-manila\") pod \"60b04c6c-582c-481e-899e-c2df2f6229d3\" (UID: \"60b04c6c-582c-481e-899e-c2df2f6229d3\") " Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.368486 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.368915 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.373139 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-kube-api-access-tnvc7" (OuterVolumeSpecName: "kube-api-access-tnvc7") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "kube-api-access-tnvc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.373353 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-ceph" (OuterVolumeSpecName: "ceph") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.373916 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-scripts" (OuterVolumeSpecName: "scripts") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.375552 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.443669 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469769 4992 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469798 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469808 4992 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469816 4992 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/60b04c6c-582c-481e-899e-c2df2f6229d3-var-lib-manila\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469824 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469832 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.469841 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnvc7\" (UniqueName: \"kubernetes.io/projected/60b04c6c-582c-481e-899e-c2df2f6229d3-kube-api-access-tnvc7\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.510083 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data" (OuterVolumeSpecName: "config-data") pod "60b04c6c-582c-481e-899e-c2df2f6229d3" (UID: "60b04c6c-582c-481e-899e-c2df2f6229d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.572006 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b04c6c-582c-481e-899e-c2df2f6229d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.913302 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"60b04c6c-582c-481e-899e-c2df2f6229d3","Type":"ContainerDied","Data":"ad1f09728461f60d18a16b1a5b7a98de207afb25554364781598e7b3dfdc279f"} Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.913357 4992 scope.go:117] "RemoveContainer" containerID="02cf441f137983923d74c51f94ba8ae632f7efe1f34c01e6a05cddd2647b4a6f" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.913389 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.916088 4992 generic.go:334] "Generic (PLEG): container finished" podID="d357b512-28b5-40f4-9839-7224cc8db4d7" containerID="ab6e98e44dfac57fcd6a08dcc0cf367af80c2538c861fa60fafadf55263e1ffc" exitCode=0 Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.916202 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" event={"ID":"d357b512-28b5-40f4-9839-7224cc8db4d7","Type":"ContainerDied","Data":"ab6e98e44dfac57fcd6a08dcc0cf367af80c2538c861fa60fafadf55263e1ffc"} Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.916229 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" event={"ID":"d357b512-28b5-40f4-9839-7224cc8db4d7","Type":"ContainerStarted","Data":"05aa70eb7a3a970c6b072ad478f9ac400924d6718bb03d8a81ed05ae6b023532"} Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.919618 4992 generic.go:334] "Generic (PLEG): container finished" podID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerID="60c04f4ae1dadd685606d0389e02c44ce02af1ef8491b940daf265104d504641" exitCode=0 Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.920000 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d19358db-cbfe-43cc-8305-b2526c0c8fd2","Type":"ContainerDied","Data":"60c04f4ae1dadd685606d0389e02c44ce02af1ef8491b940daf265104d504641"} Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.956398 4992 scope.go:117] "RemoveContainer" containerID="4419d810b7105ddcb87a406192e26139c2711846013a97f381dbd08325ee5afa" Jan 31 10:15:01 crc kubenswrapper[4992]: I0131 10:15:01.978978 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.002970 4992 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.028640 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:15:02 crc kubenswrapper[4992]: E0131 10:15:02.029007 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="manila-share" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.029023 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="manila-share" Jan 31 10:15:02 crc kubenswrapper[4992]: E0131 10:15:02.029036 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="probe" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.029044 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="probe" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.029292 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="probe" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.029564 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" containerName="manila-share" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.030807 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.032384 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.036867 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.069174 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092298 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lc78\" (UniqueName: \"kubernetes.io/projected/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-kube-api-access-6lc78\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092358 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092390 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-ceph\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092574 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092626 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092656 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-scripts\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092685 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.092749 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-config-data\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.194076 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data\") pod \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.194149 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-combined-ca-bundle\") pod \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.194967 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t265v\" (UniqueName: \"kubernetes.io/projected/d19358db-cbfe-43cc-8305-b2526c0c8fd2-kube-api-access-t265v\") pod \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195042 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data-custom\") pod \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195087 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d19358db-cbfe-43cc-8305-b2526c0c8fd2-etc-machine-id\") pod \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195139 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-scripts\") pod \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\" (UID: \"d19358db-cbfe-43cc-8305-b2526c0c8fd2\") " Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 
10:15:02.195224 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d19358db-cbfe-43cc-8305-b2526c0c8fd2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d19358db-cbfe-43cc-8305-b2526c0c8fd2" (UID: "d19358db-cbfe-43cc-8305-b2526c0c8fd2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195554 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-scripts\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195587 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195643 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-config-data\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195669 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lc78\" (UniqueName: \"kubernetes.io/projected/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-kube-api-access-6lc78\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195693 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195722 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-ceph\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195860 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195896 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.195961 4992 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d19358db-cbfe-43cc-8305-b2526c0c8fd2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.196003 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " 
pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.196082 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.199295 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d19358db-cbfe-43cc-8305-b2526c0c8fd2" (UID: "d19358db-cbfe-43cc-8305-b2526c0c8fd2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.199399 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.199586 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-ceph\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.200441 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19358db-cbfe-43cc-8305-b2526c0c8fd2-kube-api-access-t265v" (OuterVolumeSpecName: "kube-api-access-t265v") pod "d19358db-cbfe-43cc-8305-b2526c0c8fd2" (UID: "d19358db-cbfe-43cc-8305-b2526c0c8fd2"). InnerVolumeSpecName "kube-api-access-t265v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.200674 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.200893 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-scripts\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.200992 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-config-data\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.213118 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-scripts" (OuterVolumeSpecName: "scripts") pod "d19358db-cbfe-43cc-8305-b2526c0c8fd2" (UID: "d19358db-cbfe-43cc-8305-b2526c0c8fd2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.216595 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lc78\" (UniqueName: \"kubernetes.io/projected/5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb-kube-api-access-6lc78\") pod \"manila-share-share1-0\" (UID: \"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb\") " pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.255282 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d19358db-cbfe-43cc-8305-b2526c0c8fd2" (UID: "d19358db-cbfe-43cc-8305-b2526c0c8fd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.297846 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data" (OuterVolumeSpecName: "config-data") pod "d19358db-cbfe-43cc-8305-b2526c0c8fd2" (UID: "d19358db-cbfe-43cc-8305-b2526c0c8fd2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.298358 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.298383 4992 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.298395 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t265v\" (UniqueName: \"kubernetes.io/projected/d19358db-cbfe-43cc-8305-b2526c0c8fd2-kube-api-access-t265v\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.298404 4992 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.298427 4992 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19358db-cbfe-43cc-8305-b2526c0c8fd2-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.384512 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.903453 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.936118 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb","Type":"ContainerStarted","Data":"bd6ce5f97ca2126af38be28e82b4b637d132e9cc399d4dbb05bbd5866f9124ea"} Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.941308 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d19358db-cbfe-43cc-8305-b2526c0c8fd2","Type":"ContainerDied","Data":"14e4dec045029eadda6909b82e5f5e4d4b224ff1e449eb7348a75135d1f1f4fa"} Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.941381 4992 scope.go:117] "RemoveContainer" containerID="f1911e5661a59cb7708532a13c2463d506ab4101a5e1a464ba6d210ed9ecee59" Jan 31 10:15:02 crc kubenswrapper[4992]: I0131 10:15:02.941486 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.028163 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.059877 4992 scope.go:117] "RemoveContainer" containerID="60c04f4ae1dadd685606d0389e02c44ce02af1ef8491b940daf265104d504641" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.083271 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.095057 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:15:03 crc kubenswrapper[4992]: E0131 10:15:03.095484 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="manila-scheduler" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.095499 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="manila-scheduler" Jan 31 10:15:03 crc kubenswrapper[4992]: E0131 10:15:03.095533 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="probe" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.095548 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="probe" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.095822 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="probe" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.095841 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" containerName="manila-scheduler" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.096877 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.099742 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.109183 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.194824 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b04c6c-582c-481e-899e-c2df2f6229d3" path="/var/lib/kubelet/pods/60b04c6c-582c-481e-899e-c2df2f6229d3/volumes" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.196071 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19358db-cbfe-43cc-8305-b2526c0c8fd2" path="/var/lib/kubelet/pods/d19358db-cbfe-43cc-8305-b2526c0c8fd2/volumes" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.221889 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-scripts\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.223154 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.223349 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: 
\"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.224574 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db63f7ec-78d1-4773-a6c3-6b48f02f3017-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.224806 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vv6d\" (UniqueName: \"kubernetes.io/projected/db63f7ec-78d1-4773-a6c3-6b48f02f3017-kube-api-access-6vv6d\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.224935 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-config-data\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.273064 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.327308 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.327402 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db63f7ec-78d1-4773-a6c3-6b48f02f3017-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.327533 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vv6d\" (UniqueName: \"kubernetes.io/projected/db63f7ec-78d1-4773-a6c3-6b48f02f3017-kube-api-access-6vv6d\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.327540 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db63f7ec-78d1-4773-a6c3-6b48f02f3017-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.327659 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-config-data\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 
10:15:03.327858 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-scripts\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.328313 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.333853 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.334435 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.337435 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-scripts\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.337661 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db63f7ec-78d1-4773-a6c3-6b48f02f3017-config-data\") pod 
\"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.348908 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vv6d\" (UniqueName: \"kubernetes.io/projected/db63f7ec-78d1-4773-a6c3-6b48f02f3017-kube-api-access-6vv6d\") pod \"manila-scheduler-0\" (UID: \"db63f7ec-78d1-4773-a6c3-6b48f02f3017\") " pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.417677 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.430096 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d357b512-28b5-40f4-9839-7224cc8db4d7-secret-volume\") pod \"d357b512-28b5-40f4-9839-7224cc8db4d7\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.430168 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d357b512-28b5-40f4-9839-7224cc8db4d7-config-volume\") pod \"d357b512-28b5-40f4-9839-7224cc8db4d7\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.430191 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c67bn\" (UniqueName: \"kubernetes.io/projected/d357b512-28b5-40f4-9839-7224cc8db4d7-kube-api-access-c67bn\") pod \"d357b512-28b5-40f4-9839-7224cc8db4d7\" (UID: \"d357b512-28b5-40f4-9839-7224cc8db4d7\") " Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.431062 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d357b512-28b5-40f4-9839-7224cc8db4d7-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"d357b512-28b5-40f4-9839-7224cc8db4d7" (UID: "d357b512-28b5-40f4-9839-7224cc8db4d7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.434633 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d357b512-28b5-40f4-9839-7224cc8db4d7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d357b512-28b5-40f4-9839-7224cc8db4d7" (UID: "d357b512-28b5-40f4-9839-7224cc8db4d7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.436211 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d357b512-28b5-40f4-9839-7224cc8db4d7-kube-api-access-c67bn" (OuterVolumeSpecName: "kube-api-access-c67bn") pod "d357b512-28b5-40f4-9839-7224cc8db4d7" (UID: "d357b512-28b5-40f4-9839-7224cc8db4d7"). InnerVolumeSpecName "kube-api-access-c67bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.532787 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d357b512-28b5-40f4-9839-7224cc8db4d7-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.533184 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c67bn\" (UniqueName: \"kubernetes.io/projected/d357b512-28b5-40f4-9839-7224cc8db4d7-kube-api-access-c67bn\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.533198 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d357b512-28b5-40f4-9839-7224cc8db4d7-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.885765 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 31 10:15:03 crc kubenswrapper[4992]: W0131 10:15:03.888309 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb63f7ec_78d1_4773_a6c3_6b48f02f3017.slice/crio-c0766a681d7f8825682fb406c84971645c960781a76aed99344f3bfe6c106a84 WatchSource:0}: Error finding container c0766a681d7f8825682fb406c84971645c960781a76aed99344f3bfe6c106a84: Status 404 returned error can't find the container with id c0766a681d7f8825682fb406c84971645c960781a76aed99344f3bfe6c106a84 Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.952311 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb","Type":"ContainerStarted","Data":"8a8c7f18cd0516904e85ecf93d5e084de5ee0296c71a5c5d36565ec710cb38b1"} Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.955319 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"db63f7ec-78d1-4773-a6c3-6b48f02f3017","Type":"ContainerStarted","Data":"c0766a681d7f8825682fb406c84971645c960781a76aed99344f3bfe6c106a84"} Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.958431 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" event={"ID":"d357b512-28b5-40f4-9839-7224cc8db4d7","Type":"ContainerDied","Data":"05aa70eb7a3a970c6b072ad478f9ac400924d6718bb03d8a81ed05ae6b023532"} Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.958482 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05aa70eb7a3a970c6b072ad478f9ac400924d6718bb03d8a81ed05ae6b023532" Jan 31 10:15:03 crc kubenswrapper[4992]: I0131 10:15:03.958524 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497575-hb225" Jan 31 10:15:04 crc kubenswrapper[4992]: I0131 10:15:04.360824 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5"] Jan 31 10:15:04 crc kubenswrapper[4992]: I0131 10:15:04.373749 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-m2dt5"] Jan 31 10:15:04 crc kubenswrapper[4992]: I0131 10:15:04.975387 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"db63f7ec-78d1-4773-a6c3-6b48f02f3017","Type":"ContainerStarted","Data":"b03b739c1dcdf67a009af6673325befebde44b0f1e8282968f3d5f3a5420265f"} Jan 31 10:15:04 crc kubenswrapper[4992]: I0131 10:15:04.975742 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"db63f7ec-78d1-4773-a6c3-6b48f02f3017","Type":"ContainerStarted","Data":"4dddd06d7b6a76ac95b98a13482655f41fec35678ea58976a0cfa3522c8ef464"} Jan 31 10:15:04 crc 
kubenswrapper[4992]: I0131 10:15:04.978099 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb","Type":"ContainerStarted","Data":"5b35b5f8776bd186890eeeb6e1bf3e2385a8679416c8c6085e019d595fff5a85"} Jan 31 10:15:04 crc kubenswrapper[4992]: I0131 10:15:04.995975 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=1.995956007 podStartE2EDuration="1.995956007s" podCreationTimestamp="2026-01-31 10:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:15:04.991291423 +0000 UTC m=+3000.962683420" watchObservedRunningTime="2026-01-31 10:15:04.995956007 +0000 UTC m=+3000.967348004" Jan 31 10:15:05 crc kubenswrapper[4992]: I0131 10:15:05.016840 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.016821657 podStartE2EDuration="4.016821657s" podCreationTimestamp="2026-01-31 10:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:15:05.010963088 +0000 UTC m=+3000.982355085" watchObservedRunningTime="2026-01-31 10:15:05.016821657 +0000 UTC m=+3000.988213644" Jan 31 10:15:05 crc kubenswrapper[4992]: I0131 10:15:05.198505 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2732f7b0-a210-4d8a-82a1-952cabafab5d" path="/var/lib/kubelet/pods/2732f7b0-a210-4d8a-82a1-952cabafab5d/volumes" Jan 31 10:15:07 crc kubenswrapper[4992]: I0131 10:15:07.452905 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Jan 31 10:15:11 crc kubenswrapper[4992]: I0131 10:15:11.182574 4992 scope.go:117] "RemoveContainer" 
containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:15:11 crc kubenswrapper[4992]: E0131 10:15:11.183174 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:15:12 crc kubenswrapper[4992]: I0131 10:15:12.385279 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 31 10:15:13 crc kubenswrapper[4992]: I0131 10:15:13.418230 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 31 10:15:19 crc kubenswrapper[4992]: I0131 10:15:19.198787 4992 scope.go:117] "RemoveContainer" containerID="5f0a48802e49e695dfdea7d957e53147c205f4405303ce5f48ab323d4d287758" Jan 31 10:15:23 crc kubenswrapper[4992]: I0131 10:15:23.183539 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:15:23 crc kubenswrapper[4992]: E0131 10:15:23.184873 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:15:23 crc kubenswrapper[4992]: I0131 10:15:23.832580 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 31 10:15:24 crc kubenswrapper[4992]: I0131 
10:15:24.956072 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 31 10:15:25 crc kubenswrapper[4992]: I0131 10:15:25.253641 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 31 10:15:35 crc kubenswrapper[4992]: I0131 10:15:35.200277 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:15:35 crc kubenswrapper[4992]: E0131 10:15:35.203213 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:15:50 crc kubenswrapper[4992]: I0131 10:15:50.183032 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:15:50 crc kubenswrapper[4992]: E0131 10:15:50.184036 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:16:04 crc kubenswrapper[4992]: I0131 10:16:04.182852 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:16:04 crc kubenswrapper[4992]: E0131 10:16:04.183613 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:16:04 crc kubenswrapper[4992]: E0131 10:16:04.533884 4992 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.243:56714->38.129.56.243:43563: write tcp 38.129.56.243:56714->38.129.56.243:43563: write: broken pipe Jan 31 10:16:10 crc kubenswrapper[4992]: E0131 10:16:10.705202 4992 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.243:33390->38.129.56.243:43563: write tcp 38.129.56.243:33390->38.129.56.243:43563: write: broken pipe Jan 31 10:16:18 crc kubenswrapper[4992]: I0131 10:16:18.183029 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:16:18 crc kubenswrapper[4992]: E0131 10:16:18.183754 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.866069 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Jan 31 10:16:23 crc kubenswrapper[4992]: E0131 10:16:23.867335 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d357b512-28b5-40f4-9839-7224cc8db4d7" containerName="collect-profiles" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.867350 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d357b512-28b5-40f4-9839-7224cc8db4d7" containerName="collect-profiles" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.869586 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="d357b512-28b5-40f4-9839-7224cc8db4d7" containerName="collect-profiles" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.870845 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.873461 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.873568 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.873612 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bsch7" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.873939 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 31 10:16:23 crc kubenswrapper[4992]: I0131 10:16:23.901527 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.016413 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbrvm\" (UniqueName: \"kubernetes.io/projected/aa081a77-baa5-4663-899c-9738d3904244-kube-api-access-dbrvm\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.016719 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.016787 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.016860 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.016985 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.017238 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.017289 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.017316 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.017454 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.017587 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119084 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119216 4992 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119246 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119269 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119294 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119346 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119385 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dbrvm\" (UniqueName: \"kubernetes.io/projected/aa081a77-baa5-4663-899c-9738d3904244-kube-api-access-dbrvm\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119494 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119519 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119553 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.119723 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.120386 4992 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.121190 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-config-data\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.121906 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-openstack-config\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.123037 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.128135 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-openstack-config-secret\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 
10:16:24.128135 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ceph\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.128488 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ssh-key\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.134409 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ca-certs\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.138574 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbrvm\" (UniqueName: \"kubernetes.io/projected/aa081a77-baa5-4663-899c-9738d3904244-kube-api-access-dbrvm\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.146760 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s00-full\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") " pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.194008 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.716730 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s00-full"] Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.723765 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 10:16:24 crc kubenswrapper[4992]: I0131 10:16:24.814793 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"aa081a77-baa5-4663-899c-9738d3904244","Type":"ContainerStarted","Data":"31ac382c3d37cd5e3e875b113d7c7daa6f313fbc81b0d7bf121d86ea8d3592cb"} Jan 31 10:16:30 crc kubenswrapper[4992]: I0131 10:16:30.184776 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:16:30 crc kubenswrapper[4992]: E0131 10:16:30.185723 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:16:45 crc kubenswrapper[4992]: I0131 10:16:45.189219 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:16:45 crc kubenswrapper[4992]: E0131 10:16:45.189889 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:16:58 crc kubenswrapper[4992]: E0131 10:16:58.295654 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 31 10:16:58 crc kubenswrapper[4992]: E0131 10:16:58.296165 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadO
nly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dbrvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest-s00-full_openstack(aa081a77-baa5-4663-899c-9738d3904244): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 10:16:58 crc kubenswrapper[4992]: E0131 10:16:58.297510 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="aa081a77-baa5-4663-899c-9738d3904244" Jan 31 10:16:59 crc kubenswrapper[4992]: E0131 10:16:59.162827 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest-s00-full" podUID="aa081a77-baa5-4663-899c-9738d3904244" Jan 31 10:16:59 crc kubenswrapper[4992]: I0131 10:16:59.182887 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:16:59 crc kubenswrapper[4992]: E0131 10:16:59.183142 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:17:11 crc kubenswrapper[4992]: I0131 10:17:11.183638 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:17:11 crc kubenswrapper[4992]: E0131 10:17:11.184574 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 
10:17:11 crc kubenswrapper[4992]: I0131 10:17:11.644126 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 31 10:17:13 crc kubenswrapper[4992]: I0131 10:17:13.303953 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"aa081a77-baa5-4663-899c-9738d3904244","Type":"ContainerStarted","Data":"d17c89e86c2f2b45d61b8e5ed7ee0e27a5eb9af7a5cfde48ae56859ba2f19336"} Jan 31 10:17:13 crc kubenswrapper[4992]: I0131 10:17:13.329064 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s00-full" podStartSLOduration=4.41076085 podStartE2EDuration="51.329049391s" podCreationTimestamp="2026-01-31 10:16:22 +0000 UTC" firstStartedPulling="2026-01-31 10:16:24.723481105 +0000 UTC m=+3080.694873102" lastFinishedPulling="2026-01-31 10:17:11.641769656 +0000 UTC m=+3127.613161643" observedRunningTime="2026-01-31 10:17:13.325991223 +0000 UTC m=+3129.297383210" watchObservedRunningTime="2026-01-31 10:17:13.329049391 +0000 UTC m=+3129.300441378" Jan 31 10:17:22 crc kubenswrapper[4992]: I0131 10:17:22.183539 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:17:22 crc kubenswrapper[4992]: E0131 10:17:22.184439 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:17:37 crc kubenswrapper[4992]: I0131 10:17:37.183389 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:17:37 crc 
kubenswrapper[4992]: E0131 10:17:37.186679 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:17:49 crc kubenswrapper[4992]: I0131 10:17:49.182323 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:17:49 crc kubenswrapper[4992]: E0131 10:17:49.183118 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:18:03 crc kubenswrapper[4992]: I0131 10:18:03.183950 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:18:03 crc kubenswrapper[4992]: E0131 10:18:03.185131 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:18:16 crc kubenswrapper[4992]: I0131 10:18:16.183131 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 
31 10:18:16 crc kubenswrapper[4992]: E0131 10:18:16.183745 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:18:31 crc kubenswrapper[4992]: I0131 10:18:31.183602 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:18:31 crc kubenswrapper[4992]: E0131 10:18:31.184689 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:18:42 crc kubenswrapper[4992]: I0131 10:18:42.183118 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:18:42 crc kubenswrapper[4992]: E0131 10:18:42.183969 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:18:57 crc kubenswrapper[4992]: I0131 10:18:57.182293 4992 scope.go:117] "RemoveContainer" 
containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:18:57 crc kubenswrapper[4992]: E0131 10:18:57.183089 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:19:10 crc kubenswrapper[4992]: I0131 10:19:10.184699 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:19:10 crc kubenswrapper[4992]: E0131 10:19:10.185733 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:19:24 crc kubenswrapper[4992]: I0131 10:19:24.182386 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:19:24 crc kubenswrapper[4992]: E0131 10:19:24.183104 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:19:35 crc kubenswrapper[4992]: I0131 10:19:35.198135 4992 scope.go:117] 
"RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:19:35 crc kubenswrapper[4992]: E0131 10:19:35.201112 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:19:47 crc kubenswrapper[4992]: I0131 10:19:47.183450 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf" Jan 31 10:19:47 crc kubenswrapper[4992]: I0131 10:19:47.849848 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"c5c91d95f3f64c3ed9c1d8e49f5084209924bbe3a1430c8bb1fda925302d8789"} Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.282003 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2s7f4"] Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.285598 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.295194 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s7f4"] Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.405765 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4md8\" (UniqueName: \"kubernetes.io/projected/b779be27-d604-445b-a95e-a5975737add7-kube-api-access-j4md8\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.405841 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-utilities\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.406156 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-catalog-content\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.508317 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4md8\" (UniqueName: \"kubernetes.io/projected/b779be27-d604-445b-a95e-a5975737add7-kube-api-access-j4md8\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.508443 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-utilities\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.508561 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-catalog-content\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.509166 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-utilities\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.509211 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-catalog-content\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.534485 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4md8\" (UniqueName: \"kubernetes.io/projected/b779be27-d604-445b-a95e-a5975737add7-kube-api-access-j4md8\") pod \"redhat-marketplace-2s7f4\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:00 crc kubenswrapper[4992]: I0131 10:20:00.653791 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:01 crc kubenswrapper[4992]: I0131 10:20:01.115350 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s7f4"] Jan 31 10:20:01 crc kubenswrapper[4992]: I0131 10:20:01.984073 4992 generic.go:334] "Generic (PLEG): container finished" podID="b779be27-d604-445b-a95e-a5975737add7" containerID="0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526" exitCode=0 Jan 31 10:20:01 crc kubenswrapper[4992]: I0131 10:20:01.984164 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s7f4" event={"ID":"b779be27-d604-445b-a95e-a5975737add7","Type":"ContainerDied","Data":"0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526"} Jan 31 10:20:01 crc kubenswrapper[4992]: I0131 10:20:01.984361 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s7f4" event={"ID":"b779be27-d604-445b-a95e-a5975737add7","Type":"ContainerStarted","Data":"e5b78510d337dede94467f8ede80d79ce417305e47e6cdb0d9ea714341e2c09a"} Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.257342 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9xstc"] Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.262960 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.289461 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xstc"] Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.380838 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-catalog-content\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.380943 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw5wp\" (UniqueName: \"kubernetes.io/projected/1c1df8cc-c50f-4da5-8235-29986c9b8615-kube-api-access-nw5wp\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.381030 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-utilities\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.483174 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-catalog-content\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.483342 4992 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-nw5wp\" (UniqueName: \"kubernetes.io/projected/1c1df8cc-c50f-4da5-8235-29986c9b8615-kube-api-access-nw5wp\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.483444 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-utilities\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.484150 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-utilities\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.484168 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-catalog-content\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.519083 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw5wp\" (UniqueName: \"kubernetes.io/projected/1c1df8cc-c50f-4da5-8235-29986c9b8615-kube-api-access-nw5wp\") pod \"redhat-operators-9xstc\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:03 crc kubenswrapper[4992]: I0131 10:20:03.595398 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:04 crc kubenswrapper[4992]: I0131 10:20:04.010828 4992 generic.go:334] "Generic (PLEG): container finished" podID="b779be27-d604-445b-a95e-a5975737add7" containerID="50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd" exitCode=0 Jan 31 10:20:04 crc kubenswrapper[4992]: I0131 10:20:04.011182 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s7f4" event={"ID":"b779be27-d604-445b-a95e-a5975737add7","Type":"ContainerDied","Data":"50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd"} Jan 31 10:20:04 crc kubenswrapper[4992]: I0131 10:20:04.295741 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9xstc"] Jan 31 10:20:05 crc kubenswrapper[4992]: I0131 10:20:05.036873 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s7f4" event={"ID":"b779be27-d604-445b-a95e-a5975737add7","Type":"ContainerStarted","Data":"87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0"} Jan 31 10:20:05 crc kubenswrapper[4992]: I0131 10:20:05.040815 4992 generic.go:334] "Generic (PLEG): container finished" podID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerID="55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f" exitCode=0 Jan 31 10:20:05 crc kubenswrapper[4992]: I0131 10:20:05.040884 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xstc" event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerDied","Data":"55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f"} Jan 31 10:20:05 crc kubenswrapper[4992]: I0131 10:20:05.040922 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xstc" 
event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerStarted","Data":"3f5801bb3f2c3a1e7f449923709c685155102a0440fff799ed10df0f6a86a5eb"} Jan 31 10:20:05 crc kubenswrapper[4992]: I0131 10:20:05.081309 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2s7f4" podStartSLOduration=2.610143217 podStartE2EDuration="5.081294482s" podCreationTimestamp="2026-01-31 10:20:00 +0000 UTC" firstStartedPulling="2026-01-31 10:20:01.986993287 +0000 UTC m=+3297.958385304" lastFinishedPulling="2026-01-31 10:20:04.458144582 +0000 UTC m=+3300.429536569" observedRunningTime="2026-01-31 10:20:05.079025267 +0000 UTC m=+3301.050417254" watchObservedRunningTime="2026-01-31 10:20:05.081294482 +0000 UTC m=+3301.052686469" Jan 31 10:20:06 crc kubenswrapper[4992]: I0131 10:20:06.054379 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xstc" event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerStarted","Data":"18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9"} Jan 31 10:20:09 crc kubenswrapper[4992]: I0131 10:20:09.085912 4992 generic.go:334] "Generic (PLEG): container finished" podID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerID="18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9" exitCode=0 Jan 31 10:20:09 crc kubenswrapper[4992]: I0131 10:20:09.085997 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xstc" event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerDied","Data":"18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9"} Jan 31 10:20:10 crc kubenswrapper[4992]: I0131 10:20:10.653938 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:10 crc kubenswrapper[4992]: I0131 10:20:10.654356 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:10 crc kubenswrapper[4992]: I0131 10:20:10.710695 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:11 crc kubenswrapper[4992]: I0131 10:20:11.114632 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xstc" event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerStarted","Data":"ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7"} Jan 31 10:20:11 crc kubenswrapper[4992]: I0131 10:20:11.143487 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9xstc" podStartSLOduration=3.219222379 podStartE2EDuration="8.143458538s" podCreationTimestamp="2026-01-31 10:20:03 +0000 UTC" firstStartedPulling="2026-01-31 10:20:05.043803365 +0000 UTC m=+3301.015195362" lastFinishedPulling="2026-01-31 10:20:09.968039544 +0000 UTC m=+3305.939431521" observedRunningTime="2026-01-31 10:20:11.139582436 +0000 UTC m=+3307.110974463" watchObservedRunningTime="2026-01-31 10:20:11.143458538 +0000 UTC m=+3307.114850565" Jan 31 10:20:11 crc kubenswrapper[4992]: I0131 10:20:11.197490 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:11 crc kubenswrapper[4992]: I0131 10:20:11.853186 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s7f4"] Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.130740 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2s7f4" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="registry-server" containerID="cri-o://87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0" gracePeriod=2 Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 
10:20:13.597149 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.597357 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.602923 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.726396 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-catalog-content\") pod \"b779be27-d604-445b-a95e-a5975737add7\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.726536 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4md8\" (UniqueName: \"kubernetes.io/projected/b779be27-d604-445b-a95e-a5975737add7-kube-api-access-j4md8\") pod \"b779be27-d604-445b-a95e-a5975737add7\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.726566 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-utilities\") pod \"b779be27-d604-445b-a95e-a5975737add7\" (UID: \"b779be27-d604-445b-a95e-a5975737add7\") " Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.727277 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-utilities" (OuterVolumeSpecName: "utilities") pod "b779be27-d604-445b-a95e-a5975737add7" (UID: "b779be27-d604-445b-a95e-a5975737add7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.732601 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b779be27-d604-445b-a95e-a5975737add7-kube-api-access-j4md8" (OuterVolumeSpecName: "kube-api-access-j4md8") pod "b779be27-d604-445b-a95e-a5975737add7" (UID: "b779be27-d604-445b-a95e-a5975737add7"). InnerVolumeSpecName "kube-api-access-j4md8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.748128 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b779be27-d604-445b-a95e-a5975737add7" (UID: "b779be27-d604-445b-a95e-a5975737add7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.829544 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.830278 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4md8\" (UniqueName: \"kubernetes.io/projected/b779be27-d604-445b-a95e-a5975737add7-kube-api-access-j4md8\") on node \"crc\" DevicePath \"\"" Jan 31 10:20:13 crc kubenswrapper[4992]: I0131 10:20:13.830370 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b779be27-d604-445b-a95e-a5975737add7-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.147463 4992 generic.go:334] "Generic (PLEG): container finished" podID="b779be27-d604-445b-a95e-a5975737add7" 
containerID="87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0" exitCode=0 Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.147539 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s7f4" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.147532 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s7f4" event={"ID":"b779be27-d604-445b-a95e-a5975737add7","Type":"ContainerDied","Data":"87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0"} Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.147943 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s7f4" event={"ID":"b779be27-d604-445b-a95e-a5975737add7","Type":"ContainerDied","Data":"e5b78510d337dede94467f8ede80d79ce417305e47e6cdb0d9ea714341e2c09a"} Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.147977 4992 scope.go:117] "RemoveContainer" containerID="87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.188497 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s7f4"] Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.190983 4992 scope.go:117] "RemoveContainer" containerID="50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.197330 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s7f4"] Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.225370 4992 scope.go:117] "RemoveContainer" containerID="0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.278022 4992 scope.go:117] "RemoveContainer" containerID="87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0" Jan 31 
10:20:14 crc kubenswrapper[4992]: E0131 10:20:14.278468 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0\": container with ID starting with 87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0 not found: ID does not exist" containerID="87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.278497 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0"} err="failed to get container status \"87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0\": rpc error: code = NotFound desc = could not find container \"87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0\": container with ID starting with 87176dd52233790371841b71338fd5121417a7e0c357a493a412c354062d53f0 not found: ID does not exist" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.278516 4992 scope.go:117] "RemoveContainer" containerID="50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd" Jan 31 10:20:14 crc kubenswrapper[4992]: E0131 10:20:14.278872 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd\": container with ID starting with 50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd not found: ID does not exist" containerID="50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.278913 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd"} err="failed to get container status 
\"50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd\": rpc error: code = NotFound desc = could not find container \"50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd\": container with ID starting with 50ea0bfa3a9f1744bfd80282ec83dff359e08b51bbddc21eb1c9d06d7b3558fd not found: ID does not exist" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.278940 4992 scope.go:117] "RemoveContainer" containerID="0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526" Jan 31 10:20:14 crc kubenswrapper[4992]: E0131 10:20:14.279484 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526\": container with ID starting with 0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526 not found: ID does not exist" containerID="0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.279544 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526"} err="failed to get container status \"0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526\": rpc error: code = NotFound desc = could not find container \"0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526\": container with ID starting with 0d86681915798be7d14283cd753501dae94af5ca8c096d169a83ac9833770526 not found: ID does not exist" Jan 31 10:20:14 crc kubenswrapper[4992]: I0131 10:20:14.657395 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9xstc" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="registry-server" probeResult="failure" output=< Jan 31 10:20:14 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 10:20:14 crc kubenswrapper[4992]: > Jan 31 10:20:15 crc 
kubenswrapper[4992]: I0131 10:20:15.207751 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b779be27-d604-445b-a95e-a5975737add7" path="/var/lib/kubelet/pods/b779be27-d604-445b-a95e-a5975737add7/volumes" Jan 31 10:20:23 crc kubenswrapper[4992]: I0131 10:20:23.670587 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:23 crc kubenswrapper[4992]: I0131 10:20:23.722551 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:23 crc kubenswrapper[4992]: I0131 10:20:23.919940 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xstc"] Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.277577 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9xstc" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="registry-server" containerID="cri-o://ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7" gracePeriod=2 Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.773688 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.899599 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw5wp\" (UniqueName: \"kubernetes.io/projected/1c1df8cc-c50f-4da5-8235-29986c9b8615-kube-api-access-nw5wp\") pod \"1c1df8cc-c50f-4da5-8235-29986c9b8615\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.899680 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-catalog-content\") pod \"1c1df8cc-c50f-4da5-8235-29986c9b8615\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.899773 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-utilities\") pod \"1c1df8cc-c50f-4da5-8235-29986c9b8615\" (UID: \"1c1df8cc-c50f-4da5-8235-29986c9b8615\") " Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.901274 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-utilities" (OuterVolumeSpecName: "utilities") pod "1c1df8cc-c50f-4da5-8235-29986c9b8615" (UID: "1c1df8cc-c50f-4da5-8235-29986c9b8615"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:20:25 crc kubenswrapper[4992]: I0131 10:20:25.907064 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1df8cc-c50f-4da5-8235-29986c9b8615-kube-api-access-nw5wp" (OuterVolumeSpecName: "kube-api-access-nw5wp") pod "1c1df8cc-c50f-4da5-8235-29986c9b8615" (UID: "1c1df8cc-c50f-4da5-8235-29986c9b8615"). InnerVolumeSpecName "kube-api-access-nw5wp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.022265 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw5wp\" (UniqueName: \"kubernetes.io/projected/1c1df8cc-c50f-4da5-8235-29986c9b8615-kube-api-access-nw5wp\") on node \"crc\" DevicePath \"\"" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.022570 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.046753 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c1df8cc-c50f-4da5-8235-29986c9b8615" (UID: "1c1df8cc-c50f-4da5-8235-29986c9b8615"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.123989 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c1df8cc-c50f-4da5-8235-29986c9b8615-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.289688 4992 generic.go:334] "Generic (PLEG): container finished" podID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerID="ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7" exitCode=0 Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.289736 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9xstc" event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerDied","Data":"ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7"} Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.289767 4992 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-9xstc" event={"ID":"1c1df8cc-c50f-4da5-8235-29986c9b8615","Type":"ContainerDied","Data":"3f5801bb3f2c3a1e7f449923709c685155102a0440fff799ed10df0f6a86a5eb"} Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.289788 4992 scope.go:117] "RemoveContainer" containerID="ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.289929 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9xstc" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.339663 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9xstc"] Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.344279 4992 scope.go:117] "RemoveContainer" containerID="18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.348912 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9xstc"] Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.376844 4992 scope.go:117] "RemoveContainer" containerID="55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.420189 4992 scope.go:117] "RemoveContainer" containerID="ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7" Jan 31 10:20:26 crc kubenswrapper[4992]: E0131 10:20:26.420906 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7\": container with ID starting with ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7 not found: ID does not exist" containerID="ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.420966 4992 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7"} err="failed to get container status \"ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7\": rpc error: code = NotFound desc = could not find container \"ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7\": container with ID starting with ebbf150be66b9085200fc346085b94d979552f4f0f1be6494fd72172a981b4b7 not found: ID does not exist" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.421002 4992 scope.go:117] "RemoveContainer" containerID="18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9" Jan 31 10:20:26 crc kubenswrapper[4992]: E0131 10:20:26.421373 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9\": container with ID starting with 18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9 not found: ID does not exist" containerID="18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.421447 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9"} err="failed to get container status \"18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9\": rpc error: code = NotFound desc = could not find container \"18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9\": container with ID starting with 18392ffd70881ed34ec0447a62268e4f83c5717defdac9f8e237ea4f435f22c9 not found: ID does not exist" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.421486 4992 scope.go:117] "RemoveContainer" containerID="55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f" Jan 31 10:20:26 crc kubenswrapper[4992]: E0131 
10:20:26.421911 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f\": container with ID starting with 55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f not found: ID does not exist" containerID="55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f" Jan 31 10:20:26 crc kubenswrapper[4992]: I0131 10:20:26.422047 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f"} err="failed to get container status \"55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f\": rpc error: code = NotFound desc = could not find container \"55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f\": container with ID starting with 55c8ada1efcb0887946c8a52913b5293bf416c1578e11e57127640368a44173f not found: ID does not exist" Jan 31 10:20:27 crc kubenswrapper[4992]: I0131 10:20:27.198344 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" path="/var/lib/kubelet/pods/1c1df8cc-c50f-4da5-8235-29986c9b8615/volumes" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.665644 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tj6wf"] Jan 31 10:21:02 crc kubenswrapper[4992]: E0131 10:21:02.666854 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="registry-server" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.666989 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="registry-server" Jan 31 10:21:02 crc kubenswrapper[4992]: E0131 10:21:02.667041 4992 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="extract-utilities" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667055 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="extract-utilities" Jan 31 10:21:02 crc kubenswrapper[4992]: E0131 10:21:02.667087 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="extract-utilities" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667098 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="extract-utilities" Jan 31 10:21:02 crc kubenswrapper[4992]: E0131 10:21:02.667117 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="registry-server" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667128 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="registry-server" Jan 31 10:21:02 crc kubenswrapper[4992]: E0131 10:21:02.667147 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="extract-content" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667157 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="extract-content" Jan 31 10:21:02 crc kubenswrapper[4992]: E0131 10:21:02.667175 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="extract-content" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667187 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="extract-content" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667402 4992 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b779be27-d604-445b-a95e-a5975737add7" containerName="registry-server" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.667456 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1df8cc-c50f-4da5-8235-29986c9b8615" containerName="registry-server" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.669163 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.686019 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tj6wf"] Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.746187 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-utilities\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.746470 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-824vs\" (UniqueName: \"kubernetes.io/projected/c22186a4-a797-49bd-9218-76f88404f4ed-kube-api-access-824vs\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.746791 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-catalog-content\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.848128 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-catalog-content\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.848209 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-utilities\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.848356 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-824vs\" (UniqueName: \"kubernetes.io/projected/c22186a4-a797-49bd-9218-76f88404f4ed-kube-api-access-824vs\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.848706 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-catalog-content\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.848862 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-utilities\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:02 crc kubenswrapper[4992]: I0131 10:21:02.881220 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-824vs\" (UniqueName: \"kubernetes.io/projected/c22186a4-a797-49bd-9218-76f88404f4ed-kube-api-access-824vs\") pod \"community-operators-tj6wf\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:03 crc kubenswrapper[4992]: I0131 10:21:03.006739 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:03 crc kubenswrapper[4992]: W0131 10:21:03.577389 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc22186a4_a797_49bd_9218_76f88404f4ed.slice/crio-812519ab58b1fe96c4b3f6112821ebd100209390579e7afca4c73de56d72eb17 WatchSource:0}: Error finding container 812519ab58b1fe96c4b3f6112821ebd100209390579e7afca4c73de56d72eb17: Status 404 returned error can't find the container with id 812519ab58b1fe96c4b3f6112821ebd100209390579e7afca4c73de56d72eb17 Jan 31 10:21:03 crc kubenswrapper[4992]: I0131 10:21:03.580481 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tj6wf"] Jan 31 10:21:03 crc kubenswrapper[4992]: I0131 10:21:03.675108 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj6wf" event={"ID":"c22186a4-a797-49bd-9218-76f88404f4ed","Type":"ContainerStarted","Data":"812519ab58b1fe96c4b3f6112821ebd100209390579e7afca4c73de56d72eb17"} Jan 31 10:21:04 crc kubenswrapper[4992]: I0131 10:21:04.687270 4992 generic.go:334] "Generic (PLEG): container finished" podID="c22186a4-a797-49bd-9218-76f88404f4ed" containerID="0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63" exitCode=0 Jan 31 10:21:04 crc kubenswrapper[4992]: I0131 10:21:04.687490 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj6wf" 
event={"ID":"c22186a4-a797-49bd-9218-76f88404f4ed","Type":"ContainerDied","Data":"0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63"} Jan 31 10:21:06 crc kubenswrapper[4992]: I0131 10:21:06.713976 4992 generic.go:334] "Generic (PLEG): container finished" podID="c22186a4-a797-49bd-9218-76f88404f4ed" containerID="ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d" exitCode=0 Jan 31 10:21:06 crc kubenswrapper[4992]: I0131 10:21:06.714099 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj6wf" event={"ID":"c22186a4-a797-49bd-9218-76f88404f4ed","Type":"ContainerDied","Data":"ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d"} Jan 31 10:21:07 crc kubenswrapper[4992]: I0131 10:21:07.727336 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj6wf" event={"ID":"c22186a4-a797-49bd-9218-76f88404f4ed","Type":"ContainerStarted","Data":"e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9"} Jan 31 10:21:07 crc kubenswrapper[4992]: I0131 10:21:07.757738 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tj6wf" podStartSLOduration=3.349342676 podStartE2EDuration="5.757716316s" podCreationTimestamp="2026-01-31 10:21:02 +0000 UTC" firstStartedPulling="2026-01-31 10:21:04.689478091 +0000 UTC m=+3360.660870118" lastFinishedPulling="2026-01-31 10:21:07.097851771 +0000 UTC m=+3363.069243758" observedRunningTime="2026-01-31 10:21:07.746689079 +0000 UTC m=+3363.718081086" watchObservedRunningTime="2026-01-31 10:21:07.757716316 +0000 UTC m=+3363.729108313" Jan 31 10:21:13 crc kubenswrapper[4992]: I0131 10:21:13.008589 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:13 crc kubenswrapper[4992]: I0131 10:21:13.008933 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:13 crc kubenswrapper[4992]: I0131 10:21:13.066564 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:13 crc kubenswrapper[4992]: I0131 10:21:13.847293 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:13 crc kubenswrapper[4992]: I0131 10:21:13.898523 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tj6wf"] Jan 31 10:21:15 crc kubenswrapper[4992]: I0131 10:21:15.806119 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tj6wf" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="registry-server" containerID="cri-o://e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9" gracePeriod=2 Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.361813 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.445718 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-824vs\" (UniqueName: \"kubernetes.io/projected/c22186a4-a797-49bd-9218-76f88404f4ed-kube-api-access-824vs\") pod \"c22186a4-a797-49bd-9218-76f88404f4ed\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.446007 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-utilities\") pod \"c22186a4-a797-49bd-9218-76f88404f4ed\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.447701 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-utilities" (OuterVolumeSpecName: "utilities") pod "c22186a4-a797-49bd-9218-76f88404f4ed" (UID: "c22186a4-a797-49bd-9218-76f88404f4ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.471261 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c22186a4-a797-49bd-9218-76f88404f4ed-kube-api-access-824vs" (OuterVolumeSpecName: "kube-api-access-824vs") pod "c22186a4-a797-49bd-9218-76f88404f4ed" (UID: "c22186a4-a797-49bd-9218-76f88404f4ed"). InnerVolumeSpecName "kube-api-access-824vs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.547739 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-catalog-content\") pod \"c22186a4-a797-49bd-9218-76f88404f4ed\" (UID: \"c22186a4-a797-49bd-9218-76f88404f4ed\") " Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.548275 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-824vs\" (UniqueName: \"kubernetes.io/projected/c22186a4-a797-49bd-9218-76f88404f4ed-kube-api-access-824vs\") on node \"crc\" DevicePath \"\"" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.548291 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.815061 4992 generic.go:334] "Generic (PLEG): container finished" podID="c22186a4-a797-49bd-9218-76f88404f4ed" containerID="e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9" exitCode=0 Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.815113 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj6wf" event={"ID":"c22186a4-a797-49bd-9218-76f88404f4ed","Type":"ContainerDied","Data":"e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9"} Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.815147 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tj6wf" event={"ID":"c22186a4-a797-49bd-9218-76f88404f4ed","Type":"ContainerDied","Data":"812519ab58b1fe96c4b3f6112821ebd100209390579e7afca4c73de56d72eb17"} Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.815166 4992 scope.go:117] "RemoveContainer" 
containerID="e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.815314 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tj6wf" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.844196 4992 scope.go:117] "RemoveContainer" containerID="ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.871237 4992 scope.go:117] "RemoveContainer" containerID="0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.913694 4992 scope.go:117] "RemoveContainer" containerID="e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9" Jan 31 10:21:16 crc kubenswrapper[4992]: E0131 10:21:16.914935 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9\": container with ID starting with e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9 not found: ID does not exist" containerID="e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.915004 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9"} err="failed to get container status \"e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9\": rpc error: code = NotFound desc = could not find container \"e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9\": container with ID starting with e016d9d09acc61a3b9d64994dff76d4ebe1aa025506ddfd1b88a9cf45997eed9 not found: ID does not exist" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.915031 4992 scope.go:117] "RemoveContainer" 
containerID="ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d" Jan 31 10:21:16 crc kubenswrapper[4992]: E0131 10:21:16.915489 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d\": container with ID starting with ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d not found: ID does not exist" containerID="ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.915570 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d"} err="failed to get container status \"ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d\": rpc error: code = NotFound desc = could not find container \"ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d\": container with ID starting with ff5b7b34b98bf8c3a11a07353b84e2f8f40fe6a863ccecae08b66c9236199c1d not found: ID does not exist" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.915614 4992 scope.go:117] "RemoveContainer" containerID="0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63" Jan 31 10:21:16 crc kubenswrapper[4992]: E0131 10:21:16.916035 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63\": container with ID starting with 0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63 not found: ID does not exist" containerID="0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63" Jan 31 10:21:16 crc kubenswrapper[4992]: I0131 10:21:16.916068 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63"} err="failed to get container status \"0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63\": rpc error: code = NotFound desc = could not find container \"0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63\": container with ID starting with 0561fed8b0dfab7dbe41ec1acd2a8cf23924d216174403bccc10b081c6840e63 not found: ID does not exist" Jan 31 10:21:17 crc kubenswrapper[4992]: I0131 10:21:17.074792 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c22186a4-a797-49bd-9218-76f88404f4ed" (UID: "c22186a4-a797-49bd-9218-76f88404f4ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:21:17 crc kubenswrapper[4992]: I0131 10:21:17.153438 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tj6wf"] Jan 31 10:21:17 crc kubenswrapper[4992]: I0131 10:21:17.159204 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c22186a4-a797-49bd-9218-76f88404f4ed-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:21:17 crc kubenswrapper[4992]: I0131 10:21:17.166088 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tj6wf"] Jan 31 10:21:17 crc kubenswrapper[4992]: I0131 10:21:17.193638 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" path="/var/lib/kubelet/pods/c22186a4-a797-49bd-9218-76f88404f4ed/volumes" Jan 31 10:22:15 crc kubenswrapper[4992]: I0131 10:22:15.332731 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:22:15 crc kubenswrapper[4992]: I0131 10:22:15.333211 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.356612 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-76htj"] Jan 31 10:22:37 crc kubenswrapper[4992]: E0131 10:22:37.357869 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="extract-content" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.357890 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="extract-content" Jan 31 10:22:37 crc kubenswrapper[4992]: E0131 10:22:37.357924 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="extract-utilities" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.357935 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="extract-utilities" Jan 31 10:22:37 crc kubenswrapper[4992]: E0131 10:22:37.357974 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="registry-server" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.357986 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="registry-server" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.358271 4992 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c22186a4-a797-49bd-9218-76f88404f4ed" containerName="registry-server" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.360362 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.370842 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76htj"] Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.445740 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs548\" (UniqueName: \"kubernetes.io/projected/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-kube-api-access-vs548\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.445882 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-catalog-content\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.446016 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-utilities\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.548577 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-utilities\") pod \"certified-operators-76htj\" (UID: 
\"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.548704 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs548\" (UniqueName: \"kubernetes.io/projected/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-kube-api-access-vs548\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.548847 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-catalog-content\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.549584 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-catalog-content\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.549932 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-utilities\") pod \"certified-operators-76htj\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.579084 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs548\" (UniqueName: \"kubernetes.io/projected/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-kube-api-access-vs548\") pod \"certified-operators-76htj\" (UID: 
\"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") " pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:37 crc kubenswrapper[4992]: I0131 10:22:37.688389 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76htj" Jan 31 10:22:38 crc kubenswrapper[4992]: I0131 10:22:38.166882 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76htj"] Jan 31 10:22:38 crc kubenswrapper[4992]: W0131 10:22:38.180570 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d22c43_d675_4c18_ad9e_fd33d8fea3e9.slice/crio-a5aaba29306fe039badc98d60ea3fffea8811236e00dc5ed67e77fb084e834e8 WatchSource:0}: Error finding container a5aaba29306fe039badc98d60ea3fffea8811236e00dc5ed67e77fb084e834e8: Status 404 returned error can't find the container with id a5aaba29306fe039badc98d60ea3fffea8811236e00dc5ed67e77fb084e834e8 Jan 31 10:22:38 crc kubenswrapper[4992]: I0131 10:22:38.688751 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerID="94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf" exitCode=0 Jan 31 10:22:38 crc kubenswrapper[4992]: I0131 10:22:38.688812 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerDied","Data":"94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf"} Jan 31 10:22:38 crc kubenswrapper[4992]: I0131 10:22:38.689095 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerStarted","Data":"a5aaba29306fe039badc98d60ea3fffea8811236e00dc5ed67e77fb084e834e8"} Jan 31 10:22:38 crc kubenswrapper[4992]: I0131 10:22:38.690608 4992 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 10:22:39 crc kubenswrapper[4992]: I0131 10:22:39.703367 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerStarted","Data":"6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677"} Jan 31 10:22:40 crc kubenswrapper[4992]: I0131 10:22:40.731340 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerID="6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677" exitCode=0 Jan 31 10:22:40 crc kubenswrapper[4992]: I0131 10:22:40.731747 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerDied","Data":"6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677"} Jan 31 10:22:40 crc kubenswrapper[4992]: I0131 10:22:40.731784 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerStarted","Data":"2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d"} Jan 31 10:22:40 crc kubenswrapper[4992]: I0131 10:22:40.757392 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-76htj" podStartSLOduration=2.045439716 podStartE2EDuration="3.757368732s" podCreationTimestamp="2026-01-31 10:22:37 +0000 UTC" firstStartedPulling="2026-01-31 10:22:38.690250483 +0000 UTC m=+3454.661642480" lastFinishedPulling="2026-01-31 10:22:40.402179499 +0000 UTC m=+3456.373571496" observedRunningTime="2026-01-31 10:22:40.750770586 +0000 UTC m=+3456.722162613" watchObservedRunningTime="2026-01-31 10:22:40.757368732 +0000 UTC m=+3456.728760739" Jan 31 10:22:45 crc kubenswrapper[4992]: I0131 10:22:45.302724 4992 
patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 10:22:45 crc kubenswrapper[4992]: I0131 10:22:45.303338 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 10:22:47 crc kubenswrapper[4992]: I0131 10:22:47.689602 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-76htj"
Jan 31 10:22:47 crc kubenswrapper[4992]: I0131 10:22:47.690638 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-76htj"
Jan 31 10:22:47 crc kubenswrapper[4992]: I0131 10:22:47.750819 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-76htj"
Jan 31 10:22:47 crc kubenswrapper[4992]: I0131 10:22:47.853255 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-76htj"
Jan 31 10:22:48 crc kubenswrapper[4992]: I0131 10:22:48.010817 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-76htj"]
Jan 31 10:22:49 crc kubenswrapper[4992]: I0131 10:22:49.815757 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-76htj" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="registry-server" containerID="cri-o://2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d" gracePeriod=2
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.355125 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76htj"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.453351 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs548\" (UniqueName: \"kubernetes.io/projected/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-kube-api-access-vs548\") pod \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") "
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.453436 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-catalog-content\") pod \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") "
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.453540 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-utilities\") pod \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\" (UID: \"28d22c43-d675-4c18-ad9e-fd33d8fea3e9\") "
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.454796 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-utilities" (OuterVolumeSpecName: "utilities") pod "28d22c43-d675-4c18-ad9e-fd33d8fea3e9" (UID: "28d22c43-d675-4c18-ad9e-fd33d8fea3e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.459305 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-kube-api-access-vs548" (OuterVolumeSpecName: "kube-api-access-vs548") pod "28d22c43-d675-4c18-ad9e-fd33d8fea3e9" (UID: "28d22c43-d675-4c18-ad9e-fd33d8fea3e9"). InnerVolumeSpecName "kube-api-access-vs548". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.500040 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28d22c43-d675-4c18-ad9e-fd33d8fea3e9" (UID: "28d22c43-d675-4c18-ad9e-fd33d8fea3e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.555750 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs548\" (UniqueName: \"kubernetes.io/projected/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-kube-api-access-vs548\") on node \"crc\" DevicePath \"\""
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.555784 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.555793 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d22c43-d675-4c18-ad9e-fd33d8fea3e9-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.828637 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerID="2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d" exitCode=0
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.828694 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerDied","Data":"2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d"}
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.828717 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76htj"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.828736 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76htj" event={"ID":"28d22c43-d675-4c18-ad9e-fd33d8fea3e9","Type":"ContainerDied","Data":"a5aaba29306fe039badc98d60ea3fffea8811236e00dc5ed67e77fb084e834e8"}
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.828760 4992 scope.go:117] "RemoveContainer" containerID="2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.857075 4992 scope.go:117] "RemoveContainer" containerID="6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.876503 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-76htj"]
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.888045 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-76htj"]
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.890792 4992 scope.go:117] "RemoveContainer" containerID="94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.952475 4992 scope.go:117] "RemoveContainer" containerID="2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d"
Jan 31 10:22:50 crc kubenswrapper[4992]: E0131 10:22:50.952954 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d\": container with ID starting with 2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d not found: ID does not exist" containerID="2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.952984 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d"} err="failed to get container status \"2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d\": rpc error: code = NotFound desc = could not find container \"2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d\": container with ID starting with 2bd2c7369ac1452a076ef85762d2db8c0e76c96487902982fd73acdd082c746d not found: ID does not exist"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.953005 4992 scope.go:117] "RemoveContainer" containerID="6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677"
Jan 31 10:22:50 crc kubenswrapper[4992]: E0131 10:22:50.953245 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677\": container with ID starting with 6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677 not found: ID does not exist" containerID="6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.953277 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677"} err="failed to get container status \"6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677\": rpc error: code = NotFound desc = could not find container \"6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677\": container with ID starting with 6cf09a0c317a6d847d3c64a4f27147d9adf51f9367ff4d986a0d720c7fc94677 not found: ID does not exist"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.953296 4992 scope.go:117] "RemoveContainer" containerID="94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf"
Jan 31 10:22:50 crc kubenswrapper[4992]: E0131 10:22:50.953707 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf\": container with ID starting with 94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf not found: ID does not exist" containerID="94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf"
Jan 31 10:22:50 crc kubenswrapper[4992]: I0131 10:22:50.953727 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf"} err="failed to get container status \"94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf\": rpc error: code = NotFound desc = could not find container \"94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf\": container with ID starting with 94e139f0c99930626d055e47475e517bc5fb26aaedb8e349431e6e63005a06bf not found: ID does not exist"
Jan 31 10:22:51 crc kubenswrapper[4992]: I0131 10:22:51.198014 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" path="/var/lib/kubelet/pods/28d22c43-d675-4c18-ad9e-fd33d8fea3e9/volumes"
Jan 31 10:23:15 crc kubenswrapper[4992]: I0131 10:23:15.301056 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 10:23:15 crc kubenswrapper[4992]: I0131 10:23:15.301774 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 10:23:15 crc kubenswrapper[4992]: I0131 10:23:15.301867 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks"
Jan 31 10:23:15 crc kubenswrapper[4992]: I0131 10:23:15.303503 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5c91d95f3f64c3ed9c1d8e49f5084209924bbe3a1430c8bb1fda925302d8789"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 10:23:15 crc kubenswrapper[4992]: I0131 10:23:15.303615 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://c5c91d95f3f64c3ed9c1d8e49f5084209924bbe3a1430c8bb1fda925302d8789" gracePeriod=600
Jan 31 10:23:16 crc kubenswrapper[4992]: I0131 10:23:16.212860 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="c5c91d95f3f64c3ed9c1d8e49f5084209924bbe3a1430c8bb1fda925302d8789" exitCode=0
Jan 31 10:23:16 crc kubenswrapper[4992]: I0131 10:23:16.213202 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"c5c91d95f3f64c3ed9c1d8e49f5084209924bbe3a1430c8bb1fda925302d8789"}
Jan 31 10:23:16 crc kubenswrapper[4992]: I0131 10:23:16.213368 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb"}
Jan 31 10:23:16 crc kubenswrapper[4992]: I0131 10:23:16.213389 4992 scope.go:117] "RemoveContainer" containerID="f0b43da8e5cd505bb9ba2aa7c0f6a39f9d991bf6bf295bcc3abf5bcca7dc40bf"
Jan 31 10:24:16 crc kubenswrapper[4992]: I0131 10:24:16.080279 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wd7m9"]
Jan 31 10:24:16 crc kubenswrapper[4992]: I0131 10:24:16.091183 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wd7m9"]
Jan 31 10:24:17 crc kubenswrapper[4992]: I0131 10:24:17.031399 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-eb4a-account-create-update-7lc59"]
Jan 31 10:24:17 crc kubenswrapper[4992]: I0131 10:24:17.052914 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-eb4a-account-create-update-7lc59"]
Jan 31 10:24:17 crc kubenswrapper[4992]: I0131 10:24:17.193026 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b795bf-45ca-4c3c-84a7-39a764219cc2" path="/var/lib/kubelet/pods/34b795bf-45ca-4c3c-84a7-39a764219cc2/volumes"
Jan 31 10:24:17 crc kubenswrapper[4992]: I0131 10:24:17.193580 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94de8369-bb29-499c-b221-bb53527a84e2" path="/var/lib/kubelet/pods/94de8369-bb29-499c-b221-bb53527a84e2/volumes"
Jan 31 10:24:19 crc kubenswrapper[4992]: I0131 10:24:19.581318 4992 scope.go:117] "RemoveContainer" containerID="b5a89fb17016cdaeb2b2de355d0a00a38d2e7b25eff561438ee8017c8a3ac40b"
Jan 31 10:24:19 crc kubenswrapper[4992]: I0131 10:24:19.613504 4992 scope.go:117] "RemoveContainer" containerID="c1cfe949603112a8d23a868a9c4a3f01b6c36115c62a39701c09e41de5708d82"
Jan 31 10:24:33 crc kubenswrapper[4992]: I0131 10:24:33.025944 4992 generic.go:334] "Generic (PLEG): container finished" podID="aa081a77-baa5-4663-899c-9738d3904244" containerID="d17c89e86c2f2b45d61b8e5ed7ee0e27a5eb9af7a5cfde48ae56859ba2f19336" exitCode=1
Jan 31 10:24:33 crc kubenswrapper[4992]: I0131 10:24:33.026084 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"aa081a77-baa5-4663-899c-9738d3904244","Type":"ContainerDied","Data":"d17c89e86c2f2b45d61b8e5ed7ee0e27a5eb9af7a5cfde48ae56859ba2f19336"}
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.509068 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.625786 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"]
Jan 31 10:24:34 crc kubenswrapper[4992]: E0131 10:24:34.626311 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa081a77-baa5-4663-899c-9738d3904244" containerName="tempest-tests-tempest-tests-runner"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.626337 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa081a77-baa5-4663-899c-9738d3904244" containerName="tempest-tests-tempest-tests-runner"
Jan 31 10:24:34 crc kubenswrapper[4992]: E0131 10:24:34.626357 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="extract-utilities"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.626368 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="extract-utilities"
Jan 31 10:24:34 crc kubenswrapper[4992]: E0131 10:24:34.626403 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="extract-content"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.626439 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="extract-content"
Jan 31 10:24:34 crc kubenswrapper[4992]: E0131 10:24:34.626465 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="registry-server"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.626474 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="registry-server"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.626703 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa081a77-baa5-4663-899c-9738d3904244" containerName="tempest-tests-tempest-tests-runner"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.626741 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d22c43-d675-4c18-ad9e-fd33d8fea3e9" containerName="registry-server"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.627519 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.629967 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s1"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.631622 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s1"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.635725 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"]
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.667469 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.667844 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ssh-key\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.667956 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbrvm\" (UniqueName: \"kubernetes.io/projected/aa081a77-baa5-4663-899c-9738d3904244-kube-api-access-dbrvm\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.668095 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-temporary\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.668245 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-openstack-config-secret\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.668726 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ca-certs\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.668869 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-openstack-config\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.668553 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.669199 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-workdir\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.669321 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-config-data\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.669383 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ceph\") pod \"aa081a77-baa5-4663-899c-9738d3904244\" (UID: \"aa081a77-baa5-4663-899c-9738d3904244\") "
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.670500 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-config-data" (OuterVolumeSpecName: "config-data") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.670514 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.680621 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.681842 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.681857 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa081a77-baa5-4663-899c-9738d3904244-kube-api-access-dbrvm" (OuterVolumeSpecName: "kube-api-access-dbrvm") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "kube-api-access-dbrvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.683359 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ceph" (OuterVolumeSpecName: "ceph") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.706779 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.709516 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.720344 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.742651 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aa081a77-baa5-4663-899c-9738d3904244" (UID: "aa081a77-baa5-4663-899c-9738d3904244"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772489 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772561 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772667 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772751 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772837 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772939 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.772975 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773025 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhrgg\" (UniqueName: \"kubernetes.io/projected/b763b768-dbea-43f3-a06b-b773c6332ea5-kube-api-access-fhrgg\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773076 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773173 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773263 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773334 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ceph\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773357 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ssh-key\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773372 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbrvm\" (UniqueName: \"kubernetes.io/projected/aa081a77-baa5-4663-899c-9738d3904244-kube-api-access-dbrvm\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773385 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773396 4992 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aa081a77-baa5-4663-899c-9738d3904244-ca-certs\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773407 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa081a77-baa5-4663-899c-9738d3904244-openstack-config\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.773436 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aa081a77-baa5-4663-899c-9738d3904244-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.796390 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875225 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875332 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875357 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875377 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhrgg\" (UniqueName: \"kubernetes.io/projected/b763b768-dbea-43f3-a06b-b773c6332ea5-kube-api-access-fhrgg\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875408 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875457 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875494 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875519 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.875548 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.876710 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.876940 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test"
Jan 31
10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.877308 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-config-data\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.878618 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.880873 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ssh-key\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.881144 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config-secret\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.881303 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ceph\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc 
kubenswrapper[4992]: I0131 10:24:34.881975 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ca-certs\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.898045 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhrgg\" (UniqueName: \"kubernetes.io/projected/b763b768-dbea-43f3-a06b-b773c6332ea5-kube-api-access-fhrgg\") pod \"tempest-tests-tempest-s01-single-test\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:34 crc kubenswrapper[4992]: I0131 10:24:34.949087 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:24:35 crc kubenswrapper[4992]: I0131 10:24:35.052158 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s00-full" event={"ID":"aa081a77-baa5-4663-899c-9738d3904244","Type":"ContainerDied","Data":"31ac382c3d37cd5e3e875b113d7c7daa6f313fbc81b0d7bf121d86ea8d3592cb"} Jan 31 10:24:35 crc kubenswrapper[4992]: I0131 10:24:35.052229 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ac382c3d37cd5e3e875b113d7c7daa6f313fbc81b0d7bf121d86ea8d3592cb" Jan 31 10:24:35 crc kubenswrapper[4992]: I0131 10:24:35.052977 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s00-full" Jan 31 10:24:35 crc kubenswrapper[4992]: I0131 10:24:35.536317 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest-s01-single-test"] Jan 31 10:24:36 crc kubenswrapper[4992]: I0131 10:24:36.041567 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-s9dt7"] Jan 31 10:24:36 crc kubenswrapper[4992]: I0131 10:24:36.051767 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-s9dt7"] Jan 31 10:24:36 crc kubenswrapper[4992]: I0131 10:24:36.063197 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"b763b768-dbea-43f3-a06b-b773c6332ea5","Type":"ContainerStarted","Data":"d33632735864591095b008b633ad5ae73592f85cb7aa5fe1f76ebe699f63edae"} Jan 31 10:24:37 crc kubenswrapper[4992]: I0131 10:24:37.074616 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"b763b768-dbea-43f3-a06b-b773c6332ea5","Type":"ContainerStarted","Data":"14034fd76f71a456644686aaee78e97a61f7b82836ded31359ce68e63f2ef149"} Jan 31 10:24:37 crc kubenswrapper[4992]: I0131 10:24:37.114036 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest-s01-single-test" podStartSLOduration=3.114008419 podStartE2EDuration="3.114008419s" podCreationTimestamp="2026-01-31 10:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:24:37.096738002 +0000 UTC m=+3573.068130009" watchObservedRunningTime="2026-01-31 10:24:37.114008419 +0000 UTC m=+3573.085400446" Jan 31 10:24:37 crc kubenswrapper[4992]: I0131 10:24:37.199094 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90bb7b32-ced9-4f29-8649-ceb6f46b89e5" 
path="/var/lib/kubelet/pods/90bb7b32-ced9-4f29-8649-ceb6f46b89e5/volumes" Jan 31 10:25:15 crc kubenswrapper[4992]: I0131 10:25:15.301073 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:25:15 crc kubenswrapper[4992]: I0131 10:25:15.301589 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:25:19 crc kubenswrapper[4992]: I0131 10:25:19.732017 4992 scope.go:117] "RemoveContainer" containerID="2b36f4437cd102392de284aeabdf3bfd255dbbfce36dbded673cc8a3a3e9f180" Jan 31 10:25:45 crc kubenswrapper[4992]: I0131 10:25:45.301735 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:25:45 crc kubenswrapper[4992]: I0131 10:25:45.302485 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:26:15 crc kubenswrapper[4992]: I0131 10:26:15.301580 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:26:15 crc kubenswrapper[4992]: I0131 10:26:15.302179 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:26:15 crc kubenswrapper[4992]: I0131 10:26:15.302312 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:26:15 crc kubenswrapper[4992]: I0131 10:26:15.303202 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:26:15 crc kubenswrapper[4992]: I0131 10:26:15.303279 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" gracePeriod=600 Jan 31 10:26:15 crc kubenswrapper[4992]: E0131 10:26:15.423677 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:26:17 crc kubenswrapper[4992]: I0131 10:26:17.192614 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" exitCode=0 Jan 31 10:26:17 crc kubenswrapper[4992]: I0131 10:26:17.198185 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb"} Jan 31 10:26:17 crc kubenswrapper[4992]: I0131 10:26:17.198240 4992 scope.go:117] "RemoveContainer" containerID="c5c91d95f3f64c3ed9c1d8e49f5084209924bbe3a1430c8bb1fda925302d8789" Jan 31 10:26:17 crc kubenswrapper[4992]: I0131 10:26:17.198939 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:26:17 crc kubenswrapper[4992]: E0131 10:26:17.199215 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:26:28 crc kubenswrapper[4992]: I0131 10:26:28.183916 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:26:28 crc kubenswrapper[4992]: E0131 10:26:28.185308 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:26:41 crc kubenswrapper[4992]: I0131 10:26:41.183986 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:26:41 crc kubenswrapper[4992]: E0131 10:26:41.185233 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:26:53 crc kubenswrapper[4992]: I0131 10:26:53.183254 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:26:53 crc kubenswrapper[4992]: E0131 10:26:53.184302 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:27:07 crc kubenswrapper[4992]: I0131 10:27:07.184489 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:27:07 crc kubenswrapper[4992]: E0131 10:27:07.187197 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:27:19 crc kubenswrapper[4992]: I0131 10:27:19.183644 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:27:19 crc kubenswrapper[4992]: E0131 10:27:19.184686 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:27:32 crc kubenswrapper[4992]: I0131 10:27:32.182726 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:27:32 crc kubenswrapper[4992]: E0131 10:27:32.183493 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:27:46 crc kubenswrapper[4992]: I0131 10:27:46.183079 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:27:46 crc kubenswrapper[4992]: E0131 10:27:46.184074 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:28:01 crc kubenswrapper[4992]: I0131 10:28:01.182315 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:28:01 crc kubenswrapper[4992]: E0131 10:28:01.183167 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:28:14 crc kubenswrapper[4992]: I0131 10:28:14.184162 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:28:14 crc kubenswrapper[4992]: E0131 10:28:14.185348 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:28:27 crc kubenswrapper[4992]: I0131 10:28:27.187303 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:28:27 crc kubenswrapper[4992]: E0131 10:28:27.188739 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:28:42 crc kubenswrapper[4992]: I0131 10:28:42.182409 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:28:42 crc kubenswrapper[4992]: E0131 10:28:42.183086 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:28:55 crc kubenswrapper[4992]: I0131 10:28:55.194857 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:28:55 crc kubenswrapper[4992]: E0131 10:28:55.196161 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:29:06 crc kubenswrapper[4992]: I0131 10:29:06.183237 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:29:06 crc kubenswrapper[4992]: E0131 10:29:06.184043 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:29:18 crc kubenswrapper[4992]: I0131 10:29:18.183177 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:29:18 crc kubenswrapper[4992]: E0131 10:29:18.184290 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:29:32 crc kubenswrapper[4992]: I0131 10:29:32.182725 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:29:32 crc kubenswrapper[4992]: E0131 10:29:32.183373 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:29:46 crc kubenswrapper[4992]: I0131 10:29:46.183151 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:29:46 crc kubenswrapper[4992]: E0131 10:29:46.184251 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.182719 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:30:00 crc kubenswrapper[4992]: E0131 10:30:00.183430 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.202834 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws"] Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.204340 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.206778 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.206955 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.217640 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws"] Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.237543 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-config-volume\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.237649 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-secret-volume\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.237866 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58fl\" (UniqueName: \"kubernetes.io/projected/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-kube-api-access-v58fl\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.339339 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v58fl\" (UniqueName: \"kubernetes.io/projected/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-kube-api-access-v58fl\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.339405 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-config-volume\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.339472 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-secret-volume\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.340363 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-config-volume\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.346109 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-secret-volume\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.377834 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v58fl\" (UniqueName: \"kubernetes.io/projected/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-kube-api-access-v58fl\") pod \"collect-profiles-29497590-vzhws\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.526944 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:00 crc kubenswrapper[4992]: I0131 10:30:00.975770 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws"] Jan 31 10:30:01 crc kubenswrapper[4992]: I0131 10:30:01.683934 4992 generic.go:334] "Generic (PLEG): container finished" podID="9b4b65a0-8101-4d70-a5b7-94dc991a0a71" containerID="95926118a9fd90d20a5dd69584ab6ac9aafca5168a526958276616863ed485b1" exitCode=0 Jan 31 10:30:01 crc kubenswrapper[4992]: I0131 10:30:01.684006 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" event={"ID":"9b4b65a0-8101-4d70-a5b7-94dc991a0a71","Type":"ContainerDied","Data":"95926118a9fd90d20a5dd69584ab6ac9aafca5168a526958276616863ed485b1"} Jan 31 10:30:01 crc kubenswrapper[4992]: I0131 10:30:01.684503 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" 
event={"ID":"9b4b65a0-8101-4d70-a5b7-94dc991a0a71","Type":"ContainerStarted","Data":"c3c2b121d8a6255744f2b187b186814c7c23d741af8db8fc5e78cb0ef9e81866"} Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.045022 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.092651 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v58fl\" (UniqueName: \"kubernetes.io/projected/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-kube-api-access-v58fl\") pod \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.092721 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-secret-volume\") pod \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.092893 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-config-volume\") pod \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\" (UID: \"9b4b65a0-8101-4d70-a5b7-94dc991a0a71\") " Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.093463 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b4b65a0-8101-4d70-a5b7-94dc991a0a71" (UID: "9b4b65a0-8101-4d70-a5b7-94dc991a0a71"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.105685 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b4b65a0-8101-4d70-a5b7-94dc991a0a71" (UID: "9b4b65a0-8101-4d70-a5b7-94dc991a0a71"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.105829 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-kube-api-access-v58fl" (OuterVolumeSpecName: "kube-api-access-v58fl") pod "9b4b65a0-8101-4d70-a5b7-94dc991a0a71" (UID: "9b4b65a0-8101-4d70-a5b7-94dc991a0a71"). InnerVolumeSpecName "kube-api-access-v58fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.195795 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.195841 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v58fl\" (UniqueName: \"kubernetes.io/projected/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-kube-api-access-v58fl\") on node \"crc\" DevicePath \"\"" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.195863 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4b65a0-8101-4d70-a5b7-94dc991a0a71-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.718430 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" 
event={"ID":"9b4b65a0-8101-4d70-a5b7-94dc991a0a71","Type":"ContainerDied","Data":"c3c2b121d8a6255744f2b187b186814c7c23d741af8db8fc5e78cb0ef9e81866"} Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.718814 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497590-vzhws" Jan 31 10:30:03 crc kubenswrapper[4992]: I0131 10:30:03.718829 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c2b121d8a6255744f2b187b186814c7c23d741af8db8fc5e78cb0ef9e81866" Jan 31 10:30:04 crc kubenswrapper[4992]: I0131 10:30:04.135369 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw"] Jan 31 10:30:04 crc kubenswrapper[4992]: I0131 10:30:04.145284 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497545-bmsmw"] Jan 31 10:30:05 crc kubenswrapper[4992]: I0131 10:30:05.204755 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f0b912-33a5-4498-84cf-e2f859245bb6" path="/var/lib/kubelet/pods/d0f0b912-33a5-4498-84cf-e2f859245bb6/volumes" Jan 31 10:30:15 crc kubenswrapper[4992]: I0131 10:30:15.190628 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:30:15 crc kubenswrapper[4992]: E0131 10:30:15.191629 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:30:19 crc kubenswrapper[4992]: I0131 10:30:19.923142 4992 scope.go:117] "RemoveContainer" 
containerID="c8eb930ea3eb877bff2fbce7519ca54fb8f063e14b38f3e0ffb4d680157df299" Jan 31 10:30:26 crc kubenswrapper[4992]: I0131 10:30:26.183116 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:30:26 crc kubenswrapper[4992]: E0131 10:30:26.186227 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:30:37 crc kubenswrapper[4992]: I0131 10:30:37.183177 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:30:37 crc kubenswrapper[4992]: E0131 10:30:37.184573 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:30:50 crc kubenswrapper[4992]: I0131 10:30:50.182873 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:30:50 crc kubenswrapper[4992]: E0131 10:30:50.183603 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.541258 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2dw2c"] Jan 31 10:30:58 crc kubenswrapper[4992]: E0131 10:30:58.542482 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4b65a0-8101-4d70-a5b7-94dc991a0a71" containerName="collect-profiles" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.542498 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4b65a0-8101-4d70-a5b7-94dc991a0a71" containerName="collect-profiles" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.542758 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4b65a0-8101-4d70-a5b7-94dc991a0a71" containerName="collect-profiles" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.544379 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.565343 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dw2c"] Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.664780 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dknbf\" (UniqueName: \"kubernetes.io/projected/218365ab-603a-4592-a6fd-76b3a26f79d6-kube-api-access-dknbf\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.665031 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-catalog-content\") pod \"redhat-marketplace-2dw2c\" (UID: 
\"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.665311 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-utilities\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.767200 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-catalog-content\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.767439 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-utilities\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.767515 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dknbf\" (UniqueName: \"kubernetes.io/projected/218365ab-603a-4592-a6fd-76b3a26f79d6-kube-api-access-dknbf\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.767855 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-catalog-content\") pod \"redhat-marketplace-2dw2c\" (UID: 
\"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.767948 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-utilities\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.802051 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknbf\" (UniqueName: \"kubernetes.io/projected/218365ab-603a-4592-a6fd-76b3a26f79d6-kube-api-access-dknbf\") pod \"redhat-marketplace-2dw2c\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:58 crc kubenswrapper[4992]: I0131 10:30:58.909947 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:30:59 crc kubenswrapper[4992]: I0131 10:30:59.384123 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dw2c"] Jan 31 10:30:59 crc kubenswrapper[4992]: W0131 10:30:59.389665 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218365ab_603a_4592_a6fd_76b3a26f79d6.slice/crio-37e23bed2d2d67d7d767f69fb529c2e9946c43b4e9245e1b934cf6a0a18d47d5 WatchSource:0}: Error finding container 37e23bed2d2d67d7d767f69fb529c2e9946c43b4e9245e1b934cf6a0a18d47d5: Status 404 returned error can't find the container with id 37e23bed2d2d67d7d767f69fb529c2e9946c43b4e9245e1b934cf6a0a18d47d5 Jan 31 10:31:00 crc kubenswrapper[4992]: I0131 10:31:00.349238 4992 generic.go:334] "Generic (PLEG): container finished" podID="218365ab-603a-4592-a6fd-76b3a26f79d6" 
containerID="10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81" exitCode=0 Jan 31 10:31:00 crc kubenswrapper[4992]: I0131 10:31:00.349674 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerDied","Data":"10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81"} Jan 31 10:31:00 crc kubenswrapper[4992]: I0131 10:31:00.349726 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerStarted","Data":"37e23bed2d2d67d7d767f69fb529c2e9946c43b4e9245e1b934cf6a0a18d47d5"} Jan 31 10:31:00 crc kubenswrapper[4992]: I0131 10:31:00.354078 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 10:31:01 crc kubenswrapper[4992]: I0131 10:31:01.361004 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerStarted","Data":"884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403"} Jan 31 10:31:02 crc kubenswrapper[4992]: I0131 10:31:02.183882 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:31:02 crc kubenswrapper[4992]: E0131 10:31:02.184686 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:31:02 crc kubenswrapper[4992]: I0131 10:31:02.396233 4992 generic.go:334] "Generic (PLEG): 
container finished" podID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerID="884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403" exitCode=0 Jan 31 10:31:02 crc kubenswrapper[4992]: I0131 10:31:02.396595 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerDied","Data":"884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403"} Jan 31 10:31:03 crc kubenswrapper[4992]: I0131 10:31:03.405319 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerStarted","Data":"f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6"} Jan 31 10:31:03 crc kubenswrapper[4992]: I0131 10:31:03.429733 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2dw2c" podStartSLOduration=3.023398045 podStartE2EDuration="5.429713321s" podCreationTimestamp="2026-01-31 10:30:58 +0000 UTC" firstStartedPulling="2026-01-31 10:31:00.353521707 +0000 UTC m=+3956.324913714" lastFinishedPulling="2026-01-31 10:31:02.759836993 +0000 UTC m=+3958.731228990" observedRunningTime="2026-01-31 10:31:03.427526689 +0000 UTC m=+3959.398918686" watchObservedRunningTime="2026-01-31 10:31:03.429713321 +0000 UTC m=+3959.401105308" Jan 31 10:31:08 crc kubenswrapper[4992]: I0131 10:31:08.910469 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:31:08 crc kubenswrapper[4992]: I0131 10:31:08.911134 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:31:08 crc kubenswrapper[4992]: I0131 10:31:08.980948 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2dw2c" 
Jan 31 10:31:09 crc kubenswrapper[4992]: I0131 10:31:09.499539 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:31:09 crc kubenswrapper[4992]: I0131 10:31:09.567135 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dw2c"] Jan 31 10:31:11 crc kubenswrapper[4992]: I0131 10:31:11.483832 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2dw2c" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="registry-server" containerID="cri-o://f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6" gracePeriod=2 Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.027069 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.100939 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dknbf\" (UniqueName: \"kubernetes.io/projected/218365ab-603a-4592-a6fd-76b3a26f79d6-kube-api-access-dknbf\") pod \"218365ab-603a-4592-a6fd-76b3a26f79d6\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.101143 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-catalog-content\") pod \"218365ab-603a-4592-a6fd-76b3a26f79d6\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.101322 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-utilities\") pod \"218365ab-603a-4592-a6fd-76b3a26f79d6\" (UID: \"218365ab-603a-4592-a6fd-76b3a26f79d6\") " 
Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.102626 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-utilities" (OuterVolumeSpecName: "utilities") pod "218365ab-603a-4592-a6fd-76b3a26f79d6" (UID: "218365ab-603a-4592-a6fd-76b3a26f79d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.110991 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218365ab-603a-4592-a6fd-76b3a26f79d6-kube-api-access-dknbf" (OuterVolumeSpecName: "kube-api-access-dknbf") pod "218365ab-603a-4592-a6fd-76b3a26f79d6" (UID: "218365ab-603a-4592-a6fd-76b3a26f79d6"). InnerVolumeSpecName "kube-api-access-dknbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.128908 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "218365ab-603a-4592-a6fd-76b3a26f79d6" (UID: "218365ab-603a-4592-a6fd-76b3a26f79d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.206086 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dknbf\" (UniqueName: \"kubernetes.io/projected/218365ab-603a-4592-a6fd-76b3a26f79d6-kube-api-access-dknbf\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.206306 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.206444 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218365ab-603a-4592-a6fd-76b3a26f79d6-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.499739 4992 generic.go:334] "Generic (PLEG): container finished" podID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerID="f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6" exitCode=0 Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.499831 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2dw2c" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.499827 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerDied","Data":"f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6"} Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.501844 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2dw2c" event={"ID":"218365ab-603a-4592-a6fd-76b3a26f79d6","Type":"ContainerDied","Data":"37e23bed2d2d67d7d767f69fb529c2e9946c43b4e9245e1b934cf6a0a18d47d5"} Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.501874 4992 scope.go:117] "RemoveContainer" containerID="f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.550746 4992 scope.go:117] "RemoveContainer" containerID="884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.559881 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dw2c"] Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.575629 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2dw2c"] Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.588525 4992 scope.go:117] "RemoveContainer" containerID="10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.636950 4992 scope.go:117] "RemoveContainer" containerID="f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6" Jan 31 10:31:12 crc kubenswrapper[4992]: E0131 10:31:12.637598 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6\": container with ID starting with f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6 not found: ID does not exist" containerID="f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.637688 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6"} err="failed to get container status \"f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6\": rpc error: code = NotFound desc = could not find container \"f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6\": container with ID starting with f82b0a2fafb922d0f36ac676e8c5e675d915dd910c8d70548b8f69c33c836ea6 not found: ID does not exist" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.637749 4992 scope.go:117] "RemoveContainer" containerID="884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403" Jan 31 10:31:12 crc kubenswrapper[4992]: E0131 10:31:12.638219 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403\": container with ID starting with 884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403 not found: ID does not exist" containerID="884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.638247 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403"} err="failed to get container status \"884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403\": rpc error: code = NotFound desc = could not find container \"884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403\": container with ID 
starting with 884853524ca6041425cc8d6aba8815e88ffd5d5022a83f41359ab1eade117403 not found: ID does not exist" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.638272 4992 scope.go:117] "RemoveContainer" containerID="10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81" Jan 31 10:31:12 crc kubenswrapper[4992]: E0131 10:31:12.638698 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81\": container with ID starting with 10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81 not found: ID does not exist" containerID="10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81" Jan 31 10:31:12 crc kubenswrapper[4992]: I0131 10:31:12.638839 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81"} err="failed to get container status \"10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81\": rpc error: code = NotFound desc = could not find container \"10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81\": container with ID starting with 10f1ed05476b7cf9a32e04caa19381dd52b2caf1af181344dea2e516dc254b81 not found: ID does not exist" Jan 31 10:31:13 crc kubenswrapper[4992]: I0131 10:31:13.201529 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" path="/var/lib/kubelet/pods/218365ab-603a-4592-a6fd-76b3a26f79d6/volumes" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.183320 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.379220 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47q2n"] Jan 31 10:31:16 crc kubenswrapper[4992]: E0131 
10:31:16.381135 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="extract-utilities" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.381190 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="extract-utilities" Jan 31 10:31:16 crc kubenswrapper[4992]: E0131 10:31:16.381219 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="extract-content" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.381231 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="extract-content" Jan 31 10:31:16 crc kubenswrapper[4992]: E0131 10:31:16.381275 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="registry-server" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.381288 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="registry-server" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.382168 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="218365ab-603a-4592-a6fd-76b3a26f79d6" containerName="registry-server" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.386338 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.412486 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47q2n"] Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.504598 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-catalog-content\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.504738 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcmdq\" (UniqueName: \"kubernetes.io/projected/0869fda5-e906-43c4-8a30-d0c5b62f12b6-kube-api-access-pcmdq\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.504771 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-utilities\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.543131 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"0a0e3642d17199772632eec18bcc072f796c75cfc96e2995d9dbdd6cc4109275"} Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.606747 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-catalog-content\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.607367 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcmdq\" (UniqueName: \"kubernetes.io/projected/0869fda5-e906-43c4-8a30-d0c5b62f12b6-kube-api-access-pcmdq\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.607783 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-utilities\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.607508 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-catalog-content\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.608128 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-utilities\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.627470 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcmdq\" (UniqueName: 
\"kubernetes.io/projected/0869fda5-e906-43c4-8a30-d0c5b62f12b6-kube-api-access-pcmdq\") pod \"community-operators-47q2n\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:16 crc kubenswrapper[4992]: I0131 10:31:16.722293 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:17 crc kubenswrapper[4992]: I0131 10:31:17.286012 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47q2n"] Jan 31 10:31:17 crc kubenswrapper[4992]: W0131 10:31:17.300703 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0869fda5_e906_43c4_8a30_d0c5b62f12b6.slice/crio-4042059f824523f5bac6da51cf0003b80bf2d6213126563da278dca2c97e985e WatchSource:0}: Error finding container 4042059f824523f5bac6da51cf0003b80bf2d6213126563da278dca2c97e985e: Status 404 returned error can't find the container with id 4042059f824523f5bac6da51cf0003b80bf2d6213126563da278dca2c97e985e Jan 31 10:31:19 crc kubenswrapper[4992]: I0131 10:31:19.026353 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerStarted","Data":"4042059f824523f5bac6da51cf0003b80bf2d6213126563da278dca2c97e985e"} Jan 31 10:31:20 crc kubenswrapper[4992]: I0131 10:31:20.038881 4992 generic.go:334] "Generic (PLEG): container finished" podID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerID="61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c" exitCode=0 Jan 31 10:31:20 crc kubenswrapper[4992]: I0131 10:31:20.038944 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" 
event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerDied","Data":"61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c"} Jan 31 10:31:21 crc kubenswrapper[4992]: I0131 10:31:21.050813 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerStarted","Data":"a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597"} Jan 31 10:31:22 crc kubenswrapper[4992]: I0131 10:31:22.062092 4992 generic.go:334] "Generic (PLEG): container finished" podID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerID="a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597" exitCode=0 Jan 31 10:31:22 crc kubenswrapper[4992]: I0131 10:31:22.062182 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerDied","Data":"a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597"} Jan 31 10:31:23 crc kubenswrapper[4992]: I0131 10:31:23.073152 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerStarted","Data":"533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b"} Jan 31 10:31:23 crc kubenswrapper[4992]: I0131 10:31:23.103503 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47q2n" podStartSLOduration=4.655065383 podStartE2EDuration="7.103482317s" podCreationTimestamp="2026-01-31 10:31:16 +0000 UTC" firstStartedPulling="2026-01-31 10:31:20.040551181 +0000 UTC m=+3976.011943168" lastFinishedPulling="2026-01-31 10:31:22.488968075 +0000 UTC m=+3978.460360102" observedRunningTime="2026-01-31 10:31:23.097855997 +0000 UTC m=+3979.069248004" watchObservedRunningTime="2026-01-31 10:31:23.103482317 +0000 UTC 
m=+3979.074874314" Jan 31 10:31:26 crc kubenswrapper[4992]: I0131 10:31:26.722888 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:26 crc kubenswrapper[4992]: I0131 10:31:26.723572 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:26 crc kubenswrapper[4992]: I0131 10:31:26.805373 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:27 crc kubenswrapper[4992]: I0131 10:31:27.180707 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:27 crc kubenswrapper[4992]: I0131 10:31:27.235745 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47q2n"] Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.132484 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47q2n" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="registry-server" containerID="cri-o://533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b" gracePeriod=2 Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.591311 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.700660 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-catalog-content\") pod \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.700851 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcmdq\" (UniqueName: \"kubernetes.io/projected/0869fda5-e906-43c4-8a30-d0c5b62f12b6-kube-api-access-pcmdq\") pod \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.700919 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-utilities\") pod \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\" (UID: \"0869fda5-e906-43c4-8a30-d0c5b62f12b6\") " Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.702199 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-utilities" (OuterVolumeSpecName: "utilities") pod "0869fda5-e906-43c4-8a30-d0c5b62f12b6" (UID: "0869fda5-e906-43c4-8a30-d0c5b62f12b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.711476 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0869fda5-e906-43c4-8a30-d0c5b62f12b6-kube-api-access-pcmdq" (OuterVolumeSpecName: "kube-api-access-pcmdq") pod "0869fda5-e906-43c4-8a30-d0c5b62f12b6" (UID: "0869fda5-e906-43c4-8a30-d0c5b62f12b6"). InnerVolumeSpecName "kube-api-access-pcmdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.769706 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0869fda5-e906-43c4-8a30-d0c5b62f12b6" (UID: "0869fda5-e906-43c4-8a30-d0c5b62f12b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.803186 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcmdq\" (UniqueName: \"kubernetes.io/projected/0869fda5-e906-43c4-8a30-d0c5b62f12b6-kube-api-access-pcmdq\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.803242 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:29 crc kubenswrapper[4992]: I0131 10:31:29.803258 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0869fda5-e906-43c4-8a30-d0c5b62f12b6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.147813 4992 generic.go:334] "Generic (PLEG): container finished" podID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerID="533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b" exitCode=0 Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.147871 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerDied","Data":"533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b"} Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.147922 4992 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-47q2n" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.147943 4992 scope.go:117] "RemoveContainer" containerID="533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.147929 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47q2n" event={"ID":"0869fda5-e906-43c4-8a30-d0c5b62f12b6","Type":"ContainerDied","Data":"4042059f824523f5bac6da51cf0003b80bf2d6213126563da278dca2c97e985e"} Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.203029 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47q2n"] Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.206593 4992 scope.go:117] "RemoveContainer" containerID="a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.212047 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47q2n"] Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.230469 4992 scope.go:117] "RemoveContainer" containerID="61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.296583 4992 scope.go:117] "RemoveContainer" containerID="533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b" Jan 31 10:31:30 crc kubenswrapper[4992]: E0131 10:31:30.297168 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b\": container with ID starting with 533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b not found: ID does not exist" containerID="533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.297242 
4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b"} err="failed to get container status \"533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b\": rpc error: code = NotFound desc = could not find container \"533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b\": container with ID starting with 533dfcd4e9c8f5764b23bc3384cdd5c2c092274aea97a351805a22a653d3097b not found: ID does not exist" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.297288 4992 scope.go:117] "RemoveContainer" containerID="a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597" Jan 31 10:31:30 crc kubenswrapper[4992]: E0131 10:31:30.297907 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597\": container with ID starting with a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597 not found: ID does not exist" containerID="a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.298029 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597"} err="failed to get container status \"a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597\": rpc error: code = NotFound desc = could not find container \"a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597\": container with ID starting with a3ed5981f6bfe55452f24752518db4f110b94f4363c9dcecddefe585fa50e597 not found: ID does not exist" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.298124 4992 scope.go:117] "RemoveContainer" containerID="61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c" Jan 31 10:31:30 crc kubenswrapper[4992]: E0131 
10:31:30.299869 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c\": container with ID starting with 61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c not found: ID does not exist" containerID="61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c" Jan 31 10:31:30 crc kubenswrapper[4992]: I0131 10:31:30.299933 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c"} err="failed to get container status \"61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c\": rpc error: code = NotFound desc = could not find container \"61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c\": container with ID starting with 61b164b969e19dee357f170379438a9615291de3226eb0d4f38468a62c37096c not found: ID does not exist" Jan 31 10:31:31 crc kubenswrapper[4992]: I0131 10:31:31.205348 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" path="/var/lib/kubelet/pods/0869fda5-e906-43c4-8a30-d0c5b62f12b6/volumes" Jan 31 10:31:54 crc kubenswrapper[4992]: I0131 10:31:54.411566 4992 generic.go:334] "Generic (PLEG): container finished" podID="b763b768-dbea-43f3-a06b-b773c6332ea5" containerID="14034fd76f71a456644686aaee78e97a61f7b82836ded31359ce68e63f2ef149" exitCode=1 Jan 31 10:31:54 crc kubenswrapper[4992]: I0131 10:31:54.411680 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"b763b768-dbea-43f3-a06b-b773c6332ea5","Type":"ContainerDied","Data":"14034fd76f71a456644686aaee78e97a61f7b82836ded31359ce68e63f2ef149"} Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.087741 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.218504 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ca-certs\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224284 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-config-data\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224412 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ssh-key\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224483 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-workdir\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224539 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config-secret\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224580 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224661 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ceph\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224706 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224759 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-temporary\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.224840 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhrgg\" (UniqueName: \"kubernetes.io/projected/b763b768-dbea-43f3-a06b-b773c6332ea5-kube-api-access-fhrgg\") pod \"b763b768-dbea-43f3-a06b-b773c6332ea5\" (UID: \"b763b768-dbea-43f3-a06b-b773c6332ea5\") " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.226357 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-config-data" (OuterVolumeSpecName: "config-data") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.230464 4992 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.231850 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.232006 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b763b768-dbea-43f3-a06b-b773c6332ea5-kube-api-access-fhrgg" (OuterVolumeSpecName: "kube-api-access-fhrgg") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "kube-api-access-fhrgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.234148 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ceph" (OuterVolumeSpecName: "ceph") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.237392 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.249353 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.259608 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.265126 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.280076 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.285592 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b763b768-dbea-43f3-a06b-b773c6332ea5" (UID: "b763b768-dbea-43f3-a06b-b773c6332ea5"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331833 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331878 4992 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331890 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331899 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhrgg\" (UniqueName: \"kubernetes.io/projected/b763b768-dbea-43f3-a06b-b773c6332ea5-kube-api-access-fhrgg\") on node \"crc\" DevicePath 
\"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331911 4992 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331918 4992 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331927 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b763b768-dbea-43f3-a06b-b773c6332ea5-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331936 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.331946 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b763b768-dbea-43f3-a06b-b773c6332ea5-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.353138 4992 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.430594 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest-s01-single-test" event={"ID":"b763b768-dbea-43f3-a06b-b773c6332ea5","Type":"ContainerDied","Data":"d33632735864591095b008b633ad5ae73592f85cb7aa5fe1f76ebe699f63edae"} Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.430640 4992 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d33632735864591095b008b633ad5ae73592f85cb7aa5fe1f76ebe699f63edae" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.430715 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest-s01-single-test" Jan 31 10:31:56 crc kubenswrapper[4992]: I0131 10:31:56.435593 4992 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.284769 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 31 10:32:08 crc kubenswrapper[4992]: E0131 10:32:08.286291 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="extract-utilities" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.286321 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="extract-utilities" Jan 31 10:32:08 crc kubenswrapper[4992]: E0131 10:32:08.286350 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b763b768-dbea-43f3-a06b-b773c6332ea5" containerName="tempest-tests-tempest-tests-runner" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.286366 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b763b768-dbea-43f3-a06b-b773c6332ea5" containerName="tempest-tests-tempest-tests-runner" Jan 31 10:32:08 crc kubenswrapper[4992]: E0131 10:32:08.286409 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="extract-content" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.286461 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" 
containerName="extract-content" Jan 31 10:32:08 crc kubenswrapper[4992]: E0131 10:32:08.286494 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="registry-server" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.286509 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="registry-server" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.286948 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b763b768-dbea-43f3-a06b-b773c6332ea5" containerName="tempest-tests-tempest-tests-runner" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.287007 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="0869fda5-e906-43c4-8a30-d0c5b62f12b6" containerName="registry-server" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.288327 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.292100 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bsch7" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.299362 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.423094 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a465f843-8918-48c6-899b-77cc07e022f0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.423535 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-v64f8\" (UniqueName: \"kubernetes.io/projected/a465f843-8918-48c6-899b-77cc07e022f0-kube-api-access-v64f8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a465f843-8918-48c6-899b-77cc07e022f0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.525593 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a465f843-8918-48c6-899b-77cc07e022f0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.525747 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64f8\" (UniqueName: \"kubernetes.io/projected/a465f843-8918-48c6-899b-77cc07e022f0-kube-api-access-v64f8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a465f843-8918-48c6-899b-77cc07e022f0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.526320 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a465f843-8918-48c6-899b-77cc07e022f0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.567888 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64f8\" (UniqueName: \"kubernetes.io/projected/a465f843-8918-48c6-899b-77cc07e022f0-kube-api-access-v64f8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"a465f843-8918-48c6-899b-77cc07e022f0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.582568 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a465f843-8918-48c6-899b-77cc07e022f0\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:08 crc kubenswrapper[4992]: I0131 10:32:08.632412 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 31 10:32:09 crc kubenswrapper[4992]: I0131 10:32:09.162688 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 31 10:32:09 crc kubenswrapper[4992]: I0131 10:32:09.587166 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a465f843-8918-48c6-899b-77cc07e022f0","Type":"ContainerStarted","Data":"7d53183dd37bef7568e8030d71a9decd1e90967acd4926bf932ac4d8bebbcb4c"} Jan 31 10:32:10 crc kubenswrapper[4992]: I0131 10:32:10.604405 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a465f843-8918-48c6-899b-77cc07e022f0","Type":"ContainerStarted","Data":"463aa1cc9d53dac0f92b0cb33e2b67d0f36373095970e5479fc9edd7ed4d1298"} Jan 31 10:32:10 crc kubenswrapper[4992]: I0131 10:32:10.631783 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.8100314229999999 podStartE2EDuration="2.631761544s" podCreationTimestamp="2026-01-31 10:32:08 +0000 UTC" firstStartedPulling="2026-01-31 10:32:09.166101233 +0000 UTC 
m=+4025.137493220" lastFinishedPulling="2026-01-31 10:32:09.987831344 +0000 UTC m=+4025.959223341" observedRunningTime="2026-01-31 10:32:10.624017254 +0000 UTC m=+4026.595409301" watchObservedRunningTime="2026-01-31 10:32:10.631761544 +0000 UTC m=+4026.603153541" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.268211 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.271204 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.274154 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-config" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.274234 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"tobiko-secret" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.274896 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-public-key" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.275365 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tobiko-tests-tobikotobiko-private-key" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.289729 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.300988 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458096 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlxv\" (UniqueName: \"kubernetes.io/projected/974de630-b062-43d4-825a-af00b5f4ba2f-kube-api-access-4vlxv\") pod 
\"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458169 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458196 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458312 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458343 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 
31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458389 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458452 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458496 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458522 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458880 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458941 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.458974 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561025 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561082 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 
10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561148 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561171 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561208 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561247 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561291 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: 
\"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561328 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561486 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561521 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561550 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.561634 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlxv\" (UniqueName: 
\"kubernetes.io/projected/974de630-b062-43d4-825a-af00b5f4ba2f-kube-api-access-4vlxv\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.562310 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.562360 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-public-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.562559 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-private-key\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.562672 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc 
kubenswrapper[4992]: I0131 10:32:26.563019 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.563376 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.563549 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.567296 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ca-certs\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.567406 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-kubeconfig\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: 
\"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.569247 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-openstack-config-secret\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.571533 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ceph\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.582385 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlxv\" (UniqueName: \"kubernetes.io/projected/974de630-b062-43d4-825a-af00b5f4ba2f-kube-api-access-4vlxv\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.601283 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tobiko-tests-tobiko-s00-podified-functional\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:26 crc kubenswrapper[4992]: I0131 10:32:26.654485 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:32:27 crc kubenswrapper[4992]: I0131 10:32:27.247831 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s00-podified-functional"] Jan 31 10:32:27 crc kubenswrapper[4992]: I0131 10:32:27.788077 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"974de630-b062-43d4-825a-af00b5f4ba2f","Type":"ContainerStarted","Data":"6f524cd5ccafdeaa2885dcda6e9d16ef313224a0d7cecdc13f3b25bcfa641db9"} Jan 31 10:32:42 crc kubenswrapper[4992]: I0131 10:32:42.927943 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"974de630-b062-43d4-825a-af00b5f4ba2f","Type":"ContainerStarted","Data":"069014b67fe9f6176f59fa3c80a31e7fbf72fe9f46c999b342e9198e95dfd24b"} Jan 31 10:32:42 crc kubenswrapper[4992]: I0131 10:32:42.981264 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" podStartSLOduration=3.930713446 podStartE2EDuration="17.981240834s" podCreationTimestamp="2026-01-31 10:32:25 +0000 UTC" firstStartedPulling="2026-01-31 10:32:27.250641892 +0000 UTC m=+4043.222033889" lastFinishedPulling="2026-01-31 10:32:41.30116925 +0000 UTC m=+4057.272561277" observedRunningTime="2026-01-31 10:32:42.969788978 +0000 UTC m=+4058.941181005" watchObservedRunningTime="2026-01-31 10:32:42.981240834 +0000 UTC m=+4058.952632821" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.075044 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wq6k5"] Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.082754 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.108372 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq6k5"] Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.184008 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-catalog-content\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.184179 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-utilities\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.184246 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkf77\" (UniqueName: \"kubernetes.io/projected/31552c6a-46c7-478e-ac86-2c68599ca495-kube-api-access-wkf77\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.286257 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-utilities\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.286382 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wkf77\" (UniqueName: \"kubernetes.io/projected/31552c6a-46c7-478e-ac86-2c68599ca495-kube-api-access-wkf77\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.286462 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-catalog-content\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.287222 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-utilities\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.287891 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-catalog-content\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.370197 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkf77\" (UniqueName: \"kubernetes.io/projected/31552c6a-46c7-478e-ac86-2c68599ca495-kube-api-access-wkf77\") pod \"certified-operators-wq6k5\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.409485 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:07 crc kubenswrapper[4992]: I0131 10:33:07.897710 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq6k5"] Jan 31 10:33:08 crc kubenswrapper[4992]: I0131 10:33:08.186688 4992 generic.go:334] "Generic (PLEG): container finished" podID="31552c6a-46c7-478e-ac86-2c68599ca495" containerID="2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d" exitCode=0 Jan 31 10:33:08 crc kubenswrapper[4992]: I0131 10:33:08.187730 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq6k5" event={"ID":"31552c6a-46c7-478e-ac86-2c68599ca495","Type":"ContainerDied","Data":"2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d"} Jan 31 10:33:08 crc kubenswrapper[4992]: I0131 10:33:08.187907 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq6k5" event={"ID":"31552c6a-46c7-478e-ac86-2c68599ca495","Type":"ContainerStarted","Data":"a226c0492ec9a89441838c0a54c4b3158541d04fa0c99e8c05fd2188d6a1a46f"} Jan 31 10:33:10 crc kubenswrapper[4992]: I0131 10:33:10.215353 4992 generic.go:334] "Generic (PLEG): container finished" podID="31552c6a-46c7-478e-ac86-2c68599ca495" containerID="e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c" exitCode=0 Jan 31 10:33:10 crc kubenswrapper[4992]: I0131 10:33:10.215792 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq6k5" event={"ID":"31552c6a-46c7-478e-ac86-2c68599ca495","Type":"ContainerDied","Data":"e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c"} Jan 31 10:33:11 crc kubenswrapper[4992]: I0131 10:33:11.226670 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq6k5" 
event={"ID":"31552c6a-46c7-478e-ac86-2c68599ca495","Type":"ContainerStarted","Data":"5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0"} Jan 31 10:33:11 crc kubenswrapper[4992]: I0131 10:33:11.272442 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wq6k5" podStartSLOduration=1.823919059 podStartE2EDuration="4.272398254s" podCreationTimestamp="2026-01-31 10:33:07 +0000 UTC" firstStartedPulling="2026-01-31 10:33:08.187852073 +0000 UTC m=+4084.159244060" lastFinishedPulling="2026-01-31 10:33:10.636331228 +0000 UTC m=+4086.607723255" observedRunningTime="2026-01-31 10:33:11.259907768 +0000 UTC m=+4087.231299825" watchObservedRunningTime="2026-01-31 10:33:11.272398254 +0000 UTC m=+4087.243790251" Jan 31 10:33:17 crc kubenswrapper[4992]: I0131 10:33:17.409878 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:17 crc kubenswrapper[4992]: I0131 10:33:17.410155 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:18 crc kubenswrapper[4992]: I0131 10:33:18.459415 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-wq6k5" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="registry-server" probeResult="failure" output=< Jan 31 10:33:18 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 10:33:18 crc kubenswrapper[4992]: > Jan 31 10:33:27 crc kubenswrapper[4992]: I0131 10:33:27.479133 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:27 crc kubenswrapper[4992]: I0131 10:33:27.538791 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:27 crc 
kubenswrapper[4992]: I0131 10:33:27.723183 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq6k5"] Jan 31 10:33:29 crc kubenswrapper[4992]: I0131 10:33:29.412717 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wq6k5" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="registry-server" containerID="cri-o://5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0" gracePeriod=2 Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.164392 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.332122 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkf77\" (UniqueName: \"kubernetes.io/projected/31552c6a-46c7-478e-ac86-2c68599ca495-kube-api-access-wkf77\") pod \"31552c6a-46c7-478e-ac86-2c68599ca495\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.332712 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-catalog-content\") pod \"31552c6a-46c7-478e-ac86-2c68599ca495\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.332779 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-utilities\") pod \"31552c6a-46c7-478e-ac86-2c68599ca495\" (UID: \"31552c6a-46c7-478e-ac86-2c68599ca495\") " Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.333521 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-utilities" (OuterVolumeSpecName: "utilities") pod "31552c6a-46c7-478e-ac86-2c68599ca495" (UID: "31552c6a-46c7-478e-ac86-2c68599ca495"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.333818 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.340740 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31552c6a-46c7-478e-ac86-2c68599ca495-kube-api-access-wkf77" (OuterVolumeSpecName: "kube-api-access-wkf77") pod "31552c6a-46c7-478e-ac86-2c68599ca495" (UID: "31552c6a-46c7-478e-ac86-2c68599ca495"). InnerVolumeSpecName "kube-api-access-wkf77". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.399064 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31552c6a-46c7-478e-ac86-2c68599ca495" (UID: "31552c6a-46c7-478e-ac86-2c68599ca495"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.423558 4992 generic.go:334] "Generic (PLEG): container finished" podID="31552c6a-46c7-478e-ac86-2c68599ca495" containerID="5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0" exitCode=0 Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.423617 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq6k5" event={"ID":"31552c6a-46c7-478e-ac86-2c68599ca495","Type":"ContainerDied","Data":"5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0"} Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.423633 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq6k5" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.423665 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq6k5" event={"ID":"31552c6a-46c7-478e-ac86-2c68599ca495","Type":"ContainerDied","Data":"a226c0492ec9a89441838c0a54c4b3158541d04fa0c99e8c05fd2188d6a1a46f"} Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.423690 4992 scope.go:117] "RemoveContainer" containerID="5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.435854 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31552c6a-46c7-478e-ac86-2c68599ca495-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.435896 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkf77\" (UniqueName: \"kubernetes.io/projected/31552c6a-46c7-478e-ac86-2c68599ca495-kube-api-access-wkf77\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.455915 4992 scope.go:117] "RemoveContainer" 
containerID="e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.455965 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq6k5"] Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.464952 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wq6k5"] Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.480943 4992 scope.go:117] "RemoveContainer" containerID="2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.519612 4992 scope.go:117] "RemoveContainer" containerID="5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0" Jan 31 10:33:30 crc kubenswrapper[4992]: E0131 10:33:30.520081 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0\": container with ID starting with 5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0 not found: ID does not exist" containerID="5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.520114 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0"} err="failed to get container status \"5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0\": rpc error: code = NotFound desc = could not find container \"5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0\": container with ID starting with 5a489dd0bc193766b33d934e537b9bf65f0e8763df94e1e60e68828558c7e5c0 not found: ID does not exist" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.520142 4992 scope.go:117] "RemoveContainer" 
containerID="e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c" Jan 31 10:33:30 crc kubenswrapper[4992]: E0131 10:33:30.520475 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c\": container with ID starting with e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c not found: ID does not exist" containerID="e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.520500 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c"} err="failed to get container status \"e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c\": rpc error: code = NotFound desc = could not find container \"e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c\": container with ID starting with e9a3aca52f377bf39e6358957af9894880fe67e1bc5c05d9a44cbd9dffd8fa3c not found: ID does not exist" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.520517 4992 scope.go:117] "RemoveContainer" containerID="2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d" Jan 31 10:33:30 crc kubenswrapper[4992]: E0131 10:33:30.520706 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d\": container with ID starting with 2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d not found: ID does not exist" containerID="2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d" Jan 31 10:33:30 crc kubenswrapper[4992]: I0131 10:33:30.520730 4992 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d"} err="failed to get container status \"2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d\": rpc error: code = NotFound desc = could not find container \"2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d\": container with ID starting with 2e9e1ed653d6e628bf2f1aa8418d5d240b477f56c06febe1a20234fc890fb53d not found: ID does not exist" Jan 31 10:33:31 crc kubenswrapper[4992]: I0131 10:33:31.192799 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" path="/var/lib/kubelet/pods/31552c6a-46c7-478e-ac86-2c68599ca495/volumes" Jan 31 10:33:43 crc kubenswrapper[4992]: I0131 10:33:43.541549 4992 generic.go:334] "Generic (PLEG): container finished" podID="974de630-b062-43d4-825a-af00b5f4ba2f" containerID="069014b67fe9f6176f59fa3c80a31e7fbf72fe9f46c999b342e9198e95dfd24b" exitCode=0 Jan 31 10:33:43 crc kubenswrapper[4992]: I0131 10:33:43.541735 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"974de630-b062-43d4-825a-af00b5f4ba2f","Type":"ContainerDied","Data":"069014b67fe9f6176f59fa3c80a31e7fbf72fe9f46c999b342e9198e95dfd24b"} Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.174742 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237404 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ceph\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237474 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-openstack-config-secret\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237509 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-temporary\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237541 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-config\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237634 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-kubeconfig\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237699 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-private-key\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237721 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-clouds-config\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237746 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ca-certs\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237776 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlxv\" (UniqueName: \"kubernetes.io/projected/974de630-b062-43d4-825a-af00b5f4ba2f-kube-api-access-4vlxv\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237843 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-public-key\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237866 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: 
\"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.237966 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-workdir\") pod \"974de630-b062-43d4-825a-af00b5f4ba2f\" (UID: \"974de630-b062-43d4-825a-af00b5f4ba2f\") " Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.238644 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.250736 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.253700 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Jan 31 10:33:45 crc kubenswrapper[4992]: E0131 10:33:45.254103 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="extract-utilities" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254126 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="extract-utilities" Jan 31 10:33:45 crc kubenswrapper[4992]: E0131 10:33:45.254138 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="registry-server" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254145 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="registry-server" Jan 31 10:33:45 crc kubenswrapper[4992]: E0131 10:33:45.254164 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="974de630-b062-43d4-825a-af00b5f4ba2f" containerName="tobiko-tests-tobiko" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254171 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="974de630-b062-43d4-825a-af00b5f4ba2f" containerName="tobiko-tests-tobiko" Jan 31 10:33:45 crc kubenswrapper[4992]: E0131 10:33:45.254179 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="extract-content" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254184 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="extract-content" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254638 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/974de630-b062-43d4-825a-af00b5f4ba2f-kube-api-access-4vlxv" (OuterVolumeSpecName: "kube-api-access-4vlxv") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "kube-api-access-4vlxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254682 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="31552c6a-46c7-478e-ac86-2c68599ca495" containerName="registry-server" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.254714 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="974de630-b062-43d4-825a-af00b5f4ba2f" containerName="tobiko-tests-tobiko" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.255472 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.266646 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "tobiko-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.281524 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ceph" (OuterVolumeSpecName: "ceph") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.285244 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "tobiko-private-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.291235 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.293966 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "tobiko-public-key". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.299694 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.301301 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.301350 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.313731 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.319109 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "test-operator-clouds-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.328646 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340033 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340122 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340161 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340208 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340250 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340281 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnc7r\" (UniqueName: \"kubernetes.io/projected/046d2b54-215c-47f0-82a6-9f8ea63414bf-kube-api-access-bnc7r\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340340 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340374 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340438 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340468 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340525 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340571 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340693 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340719 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-openstack-config-secret\") 
on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340732 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340745 4992 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340758 4992 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-kubeconfig\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340769 4992 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340781 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340793 4992 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/974de630-b062-43d4-825a-af00b5f4ba2f-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.340806 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlxv\" (UniqueName: \"kubernetes.io/projected/974de630-b062-43d4-825a-af00b5f4ba2f-kube-api-access-4vlxv\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc 
kubenswrapper[4992]: I0131 10:33:45.340825 4992 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/974de630-b062-43d4-825a-af00b5f4ba2f-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.372288 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443028 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443100 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443157 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443194 4992 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443252 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443313 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnc7r\" (UniqueName: \"kubernetes.io/projected/046d2b54-215c-47f0-82a6-9f8ea63414bf-kube-api-access-bnc7r\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443379 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443409 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443476 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubeconfig\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443503 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.443567 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.445209 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.448036 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-clouds-config\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.450853 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-workdir\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.450954 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-temporary\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.452362 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ca-certs\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.464329 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-kubeconfig\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.464492 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-private-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.464854 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-openstack-config-secret\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.465614 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-public-key\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.466075 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ceph\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.468113 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnc7r\" (UniqueName: \"kubernetes.io/projected/046d2b54-215c-47f0-82a6-9f8ea63414bf-kube-api-access-bnc7r\") pod \"tobiko-tests-tobiko-s01-sanity\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.571807 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" event={"ID":"974de630-b062-43d4-825a-af00b5f4ba2f","Type":"ContainerDied","Data":"6f524cd5ccafdeaa2885dcda6e9d16ef313224a0d7cecdc13f3b25bcfa641db9"} Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.571865 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f524cd5ccafdeaa2885dcda6e9d16ef313224a0d7cecdc13f3b25bcfa641db9" Jan 31 10:33:45 crc 
kubenswrapper[4992]: I0131 10:33:45.571873 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s00-podified-functional" Jan 31 10:33:45 crc kubenswrapper[4992]: I0131 10:33:45.700353 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:33:46 crc kubenswrapper[4992]: I0131 10:33:46.265637 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tobiko-tests-tobiko-s01-sanity"] Jan 31 10:33:46 crc kubenswrapper[4992]: I0131 10:33:46.543142 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "974de630-b062-43d4-825a-af00b5f4ba2f" (UID: "974de630-b062-43d4-825a-af00b5f4ba2f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:33:46 crc kubenswrapper[4992]: I0131 10:33:46.567450 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/974de630-b062-43d4-825a-af00b5f4ba2f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 31 10:33:46 crc kubenswrapper[4992]: I0131 10:33:46.582885 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"046d2b54-215c-47f0-82a6-9f8ea63414bf","Type":"ContainerStarted","Data":"5c739167b1f65e197a009b3a61dc6da349e9e1986d019d25b8eaf82d7728868d"} Jan 31 10:33:47 crc kubenswrapper[4992]: I0131 10:33:47.591544 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"046d2b54-215c-47f0-82a6-9f8ea63414bf","Type":"ContainerStarted","Data":"cf42bfab2f3130b0c59ef26db67f9cd9adf447737f54b82c7162629c600cbce5"} Jan 31 10:33:47 crc 
kubenswrapper[4992]: I0131 10:33:47.617901 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tobiko-tests-tobiko-s01-sanity" podStartSLOduration=2.617882712 podStartE2EDuration="2.617882712s" podCreationTimestamp="2026-01-31 10:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 10:33:47.609915505 +0000 UTC m=+4123.581307502" watchObservedRunningTime="2026-01-31 10:33:47.617882712 +0000 UTC m=+4123.589274699" Jan 31 10:34:15 crc kubenswrapper[4992]: I0131 10:34:15.301469 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:34:15 crc kubenswrapper[4992]: I0131 10:34:15.301878 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.303210 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.304022 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.304104 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.305408 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a0e3642d17199772632eec18bcc072f796c75cfc96e2995d9dbdd6cc4109275"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.305525 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://0a0e3642d17199772632eec18bcc072f796c75cfc96e2995d9dbdd6cc4109275" gracePeriod=600 Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.456091 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="0a0e3642d17199772632eec18bcc072f796c75cfc96e2995d9dbdd6cc4109275" exitCode=0 Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.456146 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"0a0e3642d17199772632eec18bcc072f796c75cfc96e2995d9dbdd6cc4109275"} Jan 31 10:34:45 crc kubenswrapper[4992]: I0131 10:34:45.456205 4992 scope.go:117] "RemoveContainer" containerID="c4e8533b339f9d2e5183364d697778b7b9c69376ddc558138b159fec27aae9eb" Jan 31 10:34:46 crc kubenswrapper[4992]: I0131 10:34:46.472180 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e"} Jan 31 10:35:11 crc kubenswrapper[4992]: I0131 10:35:11.709008 4992 generic.go:334] "Generic (PLEG): container finished" podID="046d2b54-215c-47f0-82a6-9f8ea63414bf" containerID="cf42bfab2f3130b0c59ef26db67f9cd9adf447737f54b82c7162629c600cbce5" exitCode=0 Jan 31 10:35:11 crc kubenswrapper[4992]: I0131 10:35:11.709088 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"046d2b54-215c-47f0-82a6-9f8ea63414bf","Type":"ContainerDied","Data":"cf42bfab2f3130b0c59ef26db67f9cd9adf447737f54b82c7162629c600cbce5"} Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.204362 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.351834 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ceph\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.351891 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-workdir\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.351949 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-kubeconfig\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: 
\"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352013 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352053 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-openstack-config-secret\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352068 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ca-certs\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352108 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-config\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352126 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-clouds-config\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352153 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnc7r\" (UniqueName: 
\"kubernetes.io/projected/046d2b54-215c-47f0-82a6-9f8ea63414bf-kube-api-access-bnc7r\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352200 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-private-key\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352230 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-public-key\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.352250 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-temporary\") pod \"046d2b54-215c-47f0-82a6-9f8ea63414bf\" (UID: \"046d2b54-215c-47f0-82a6-9f8ea63414bf\") " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.354696 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.357949 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046d2b54-215c-47f0-82a6-9f8ea63414bf-kube-api-access-bnc7r" (OuterVolumeSpecName: "kube-api-access-bnc7r") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "kube-api-access-bnc7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.360554 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.367468 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ceph" (OuterVolumeSpecName: "ceph") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.378669 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-config" (OuterVolumeSpecName: "tobiko-config") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "tobiko-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.379367 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-public-key" (OuterVolumeSpecName: "tobiko-public-key") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "tobiko-public-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.381175 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-private-key" (OuterVolumeSpecName: "tobiko-private-key") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "tobiko-private-key". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.387290 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-kubeconfig" (OuterVolumeSpecName: "kubeconfig") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "kubeconfig". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.411893 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.413272 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.421086 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455667 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455699 4992 reconciler_common.go:293] "Volume detached for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-kubeconfig\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455733 4992 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455745 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455756 4992 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/046d2b54-215c-47f0-82a6-9f8ea63414bf-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455765 4992 reconciler_common.go:293] "Volume detached for volume \"tobiko-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455775 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455783 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnc7r\" (UniqueName: \"kubernetes.io/projected/046d2b54-215c-47f0-82a6-9f8ea63414bf-kube-api-access-bnc7r\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455792 4992 reconciler_common.go:293] "Volume detached for volume \"tobiko-private-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-private-key\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455800 4992 reconciler_common.go:293] "Volume detached for volume \"tobiko-public-key\" (UniqueName: \"kubernetes.io/configmap/046d2b54-215c-47f0-82a6-9f8ea63414bf-tobiko-public-key\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.455809 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.478617 4992 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.557840 4992 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.741178 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tobiko-tests-tobiko-s01-sanity" event={"ID":"046d2b54-215c-47f0-82a6-9f8ea63414bf","Type":"ContainerDied","Data":"5c739167b1f65e197a009b3a61dc6da349e9e1986d019d25b8eaf82d7728868d"} Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.741226 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c739167b1f65e197a009b3a61dc6da349e9e1986d019d25b8eaf82d7728868d" Jan 31 10:35:13 crc kubenswrapper[4992]: I0131 10:35:13.741340 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tobiko-tests-tobiko-s01-sanity" Jan 31 10:35:14 crc kubenswrapper[4992]: I0131 10:35:14.988553 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "046d2b54-215c-47f0-82a6-9f8ea63414bf" (UID: "046d2b54-215c-47f0-82a6-9f8ea63414bf"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:35:15 crc kubenswrapper[4992]: I0131 10:35:15.089160 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/046d2b54-215c-47f0-82a6-9f8ea63414bf-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.336724 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Jan 31 10:35:19 crc kubenswrapper[4992]: E0131 10:35:19.338979 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046d2b54-215c-47f0-82a6-9f8ea63414bf" containerName="tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.339289 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="046d2b54-215c-47f0-82a6-9f8ea63414bf" containerName="tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.339707 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="046d2b54-215c-47f0-82a6-9f8ea63414bf" containerName="tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.340889 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.366804 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.485638 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.485825 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g422j\" (UniqueName: \"kubernetes.io/projected/a231b005-beb8-47f3-9dce-53a7459528f5-kube-api-access-g422j\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.588177 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g422j\" (UniqueName: \"kubernetes.io/projected/a231b005-beb8-47f3-9dce-53a7459528f5-kube-api-access-g422j\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.588333 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") " 
pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.589114 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.617056 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g422j\" (UniqueName: \"kubernetes.io/projected/a231b005-beb8-47f3-9dce-53a7459528f5-kube-api-access-g422j\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.643001 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tobiko-tobiko-tests-tobiko\" (UID: \"a231b005-beb8-47f3-9dce-53a7459528f5\") " pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:19 crc kubenswrapper[4992]: I0131 10:35:19.668401 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" Jan 31 10:35:20 crc kubenswrapper[4992]: I0131 10:35:20.144522 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko"] Jan 31 10:35:20 crc kubenswrapper[4992]: I0131 10:35:20.820465 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"a231b005-beb8-47f3-9dce-53a7459528f5","Type":"ContainerStarted","Data":"09277c4dac2c9a76888c68c5a30705534e0349327864dc75c063f547262d250a"} Jan 31 10:35:21 crc kubenswrapper[4992]: I0131 10:35:21.834960 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" event={"ID":"a231b005-beb8-47f3-9dce-53a7459528f5","Type":"ContainerStarted","Data":"f5af225dcc7ef56184e9a8397842c0ea0999f9afeccd3b726b830ba4bffc6972"} Jan 31 10:35:21 crc kubenswrapper[4992]: I0131 10:35:21.853648 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tobiko-tobiko-tests-tobiko" podStartSLOduration=2.412239117 podStartE2EDuration="2.853626151s" podCreationTimestamp="2026-01-31 10:35:19 +0000 UTC" firstStartedPulling="2026-01-31 10:35:20.143972806 +0000 UTC m=+4216.115364803" lastFinishedPulling="2026-01-31 10:35:20.58535985 +0000 UTC m=+4216.556751837" observedRunningTime="2026-01-31 10:35:21.852273162 +0000 UTC m=+4217.823665169" watchObservedRunningTime="2026-01-31 10:35:21.853626151 +0000 UTC m=+4217.825018148" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.299994 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ansibletest-ansibletest"] Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.302289 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.304585 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.306617 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.311534 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.394820 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.394876 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.394941 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.394974 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"compute-ssh-secret\" 
(UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.395008 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.395037 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.395073 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.395298 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ceph\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.395371 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.395667 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlw42\" (UniqueName: \"kubernetes.io/projected/6f390e65-a705-4a9b-a0af-3c22d4cd4193-kube-api-access-rlw42\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497756 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497841 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ceph\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497871 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497894 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlw42\" (UniqueName: 
\"kubernetes.io/projected/6f390e65-a705-4a9b-a0af-3c22d4cd4193-kube-api-access-rlw42\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497914 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497940 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.497991 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.498021 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.498058 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.498084 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.499288 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.500300 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.501619 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-temporary\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.502201 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-workdir\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.503870 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.504570 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ca-certs\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.506484 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-compute-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.509211 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-workload-ssh-secret\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.511629 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ceph\") pod \"ansibletest-ansibletest\" (UID: 
\"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.531393 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlw42\" (UniqueName: \"kubernetes.io/projected/6f390e65-a705-4a9b-a0af-3c22d4cd4193-kube-api-access-rlw42\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.555086 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ansibletest-ansibletest\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " pod="openstack/ansibletest-ansibletest" Jan 31 10:35:33 crc kubenswrapper[4992]: I0131 10:35:33.624121 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Jan 31 10:35:34 crc kubenswrapper[4992]: I0131 10:35:34.082183 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ansibletest-ansibletest"] Jan 31 10:35:34 crc kubenswrapper[4992]: W0131 10:35:34.088284 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f390e65_a705_4a9b_a0af_3c22d4cd4193.slice/crio-c7508d57727812d02d25eb9f4e8650254e635089c44037fb718b49211ec2d032 WatchSource:0}: Error finding container c7508d57727812d02d25eb9f4e8650254e635089c44037fb718b49211ec2d032: Status 404 returned error can't find the container with id c7508d57727812d02d25eb9f4e8650254e635089c44037fb718b49211ec2d032 Jan 31 10:35:34 crc kubenswrapper[4992]: I0131 10:35:34.975414 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" 
event={"ID":"6f390e65-a705-4a9b-a0af-3c22d4cd4193","Type":"ContainerStarted","Data":"c7508d57727812d02d25eb9f4e8650254e635089c44037fb718b49211ec2d032"} Jan 31 10:35:48 crc kubenswrapper[4992]: E0131 10:35:48.080541 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified" Jan 31 10:35:48 crc kubenswrapper[4992]: E0131 10:35:48.081442 4992 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 31 10:35:48 crc kubenswrapper[4992]: container &Container{Name:ansibletest-ansibletest,Image:quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_ANSIBLE_EXTRA_VARS,Value:-e manual_run=false,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_FILE_EXTRA_VARS,Value:--- Jan 31 10:35:48 crc kubenswrapper[4992]: foo: bar Jan 31 10:35:48 crc kubenswrapper[4992]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_BRANCH,Value:,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_GIT_REPO,Value:https://github.com/ansible/test-playbooks,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_INVENTORY,Value:localhost ansible_connection=local ansible_python_interpreter=python3 Jan 31 10:35:48 crc kubenswrapper[4992]: ,ValueFrom:nil,},EnvVar{Name:POD_ANSIBLE_PLAYBOOK,Value:./debug.yml,ValueFrom:nil,},EnvVar{Name:POD_DEBUG,Value:false,ValueFrom:nil,},EnvVar{Name:POD_INSTALL_COLLECTIONS,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{4 0} {} 4 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/ansible,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/AnsibleTests/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/ansible/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/var/lib/ansible/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:compute-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/.ssh/compute_id,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:workload-ssh-secret,ReadOnly:true,MountPath:/var/lib/ansible/test_keypair.key,SubPath:ssh-privatekey,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rlw42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/se
rviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*227,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*227,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ansibletest-ansibletest_openstack(6f390e65-a705-4a9b-a0af-3c22d4cd4193): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Jan 31 10:35:48 crc kubenswrapper[4992]: > logger="UnhandledError" Jan 31 10:35:48 crc kubenswrapper[4992]: E0131 10:35:48.083581 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ansibletest-ansibletest" podUID="6f390e65-a705-4a9b-a0af-3c22d4cd4193" Jan 31 10:35:48 crc kubenswrapper[4992]: E0131 10:35:48.110666 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ansibletest-ansibletest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ansible-tests:current-podified\\\"\"" pod="openstack/ansibletest-ansibletest" podUID="6f390e65-a705-4a9b-a0af-3c22d4cd4193" Jan 31 10:36:03 crc kubenswrapper[4992]: I0131 10:36:03.185030 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 10:36:05 crc kubenswrapper[4992]: I0131 
10:36:05.322990 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"6f390e65-a705-4a9b-a0af-3c22d4cd4193","Type":"ContainerStarted","Data":"33226e3ed4325d4d1294661e556b837aefbc455d577a3741f9605f15abbb93d8"} Jan 31 10:36:05 crc kubenswrapper[4992]: I0131 10:36:05.352781 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ansibletest-ansibletest" podStartSLOduration=3.628127251 podStartE2EDuration="33.352760497s" podCreationTimestamp="2026-01-31 10:35:32 +0000 UTC" firstStartedPulling="2026-01-31 10:35:34.090834683 +0000 UTC m=+4230.062226680" lastFinishedPulling="2026-01-31 10:36:03.815467909 +0000 UTC m=+4259.786859926" observedRunningTime="2026-01-31 10:36:05.348572558 +0000 UTC m=+4261.319964565" watchObservedRunningTime="2026-01-31 10:36:05.352760497 +0000 UTC m=+4261.324152494" Jan 31 10:36:07 crc kubenswrapper[4992]: I0131 10:36:07.341512 4992 generic.go:334] "Generic (PLEG): container finished" podID="6f390e65-a705-4a9b-a0af-3c22d4cd4193" containerID="33226e3ed4325d4d1294661e556b837aefbc455d577a3741f9605f15abbb93d8" exitCode=0 Jan 31 10:36:07 crc kubenswrapper[4992]: I0131 10:36:07.341567 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" event={"ID":"6f390e65-a705-4a9b-a0af-3c22d4cd4193","Type":"ContainerDied","Data":"33226e3ed4325d4d1294661e556b837aefbc455d577a3741f9605f15abbb93d8"} Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.778350 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ansibletest-ansibletest" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.907722 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-temporary\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.907786 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ca-certs\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.907831 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-workload-ssh-secret\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.907882 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.907968 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-compute-ssh-secret\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.908137 4992 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.908546 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config-secret\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.908650 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.908693 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlw42\" (UniqueName: \"kubernetes.io/projected/6f390e65-a705-4a9b-a0af-3c22d4cd4193-kube-api-access-rlw42\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.908821 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ceph\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.908905 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-workdir\") pod \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\" (UID: \"6f390e65-a705-4a9b-a0af-3c22d4cd4193\") " Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.910885 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.913843 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f390e65-a705-4a9b-a0af-3c22d4cd4193-kube-api-access-rlw42" (OuterVolumeSpecName: "kube-api-access-rlw42") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "kube-api-access-rlw42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.914527 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ceph" (OuterVolumeSpecName: "ceph") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.915222 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.927933 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.947301 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-compute-ssh-secret" (OuterVolumeSpecName: "compute-ssh-secret") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "compute-ssh-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.949240 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.951958 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-workload-ssh-secret" (OuterVolumeSpecName: "workload-ssh-secret") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "workload-ssh-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.966283 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:36:08 crc kubenswrapper[4992]: I0131 10:36:08.989444 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6f390e65-a705-4a9b-a0af-3c22d4cd4193" (UID: "6f390e65-a705-4a9b-a0af-3c22d4cd4193"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012808 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012841 4992 reconciler_common.go:293] "Volume detached for volume \"compute-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-compute-ssh-secret\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012853 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012902 4992 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " 
Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012912 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlw42\" (UniqueName: \"kubernetes.io/projected/6f390e65-a705-4a9b-a0af-3c22d4cd4193-kube-api-access-rlw42\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012922 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012933 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6f390e65-a705-4a9b-a0af-3c22d4cd4193-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012942 4992 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.012951 4992 reconciler_common.go:293] "Volume detached for volume \"workload-ssh-secret\" (UniqueName: \"kubernetes.io/secret/6f390e65-a705-4a9b-a0af-3c22d4cd4193-workload-ssh-secret\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.038058 4992 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.114949 4992 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.360592 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ansibletest-ansibletest" 
event={"ID":"6f390e65-a705-4a9b-a0af-3c22d4cd4193","Type":"ContainerDied","Data":"c7508d57727812d02d25eb9f4e8650254e635089c44037fb718b49211ec2d032"} Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.360635 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7508d57727812d02d25eb9f4e8650254e635089c44037fb718b49211ec2d032" Jan 31 10:36:09 crc kubenswrapper[4992]: I0131 10:36:09.360679 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ansibletest-ansibletest" Jan 31 10:36:17 crc kubenswrapper[4992]: I0131 10:36:17.913504 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Jan 31 10:36:17 crc kubenswrapper[4992]: E0131 10:36:17.914647 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f390e65-a705-4a9b-a0af-3c22d4cd4193" containerName="ansibletest-ansibletest" Jan 31 10:36:17 crc kubenswrapper[4992]: I0131 10:36:17.914669 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f390e65-a705-4a9b-a0af-3c22d4cd4193" containerName="ansibletest-ansibletest" Jan 31 10:36:17 crc kubenswrapper[4992]: I0131 10:36:17.915193 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f390e65-a705-4a9b-a0af-3c22d4cd4193" containerName="ansibletest-ansibletest" Jan 31 10:36:17 crc kubenswrapper[4992]: I0131 10:36:17.916170 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:17 crc kubenswrapper[4992]: I0131 10:36:17.935487 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.007302 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8tfv\" (UniqueName: \"kubernetes.io/projected/0b361a5e-e5b9-4612-a165-09280c301ac8-kube-api-access-m8tfv\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"0b361a5e-e5b9-4612-a165-09280c301ac8\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.007384 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"0b361a5e-e5b9-4612-a165-09280c301ac8\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.225171 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8tfv\" (UniqueName: \"kubernetes.io/projected/0b361a5e-e5b9-4612-a165-09280c301ac8-kube-api-access-m8tfv\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"0b361a5e-e5b9-4612-a165-09280c301ac8\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.225658 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: 
\"0b361a5e-e5b9-4612-a165-09280c301ac8\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.226337 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"0b361a5e-e5b9-4612-a165-09280c301ac8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.256481 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8tfv\" (UniqueName: \"kubernetes.io/projected/0b361a5e-e5b9-4612-a165-09280c301ac8-kube-api-access-m8tfv\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"0b361a5e-e5b9-4612-a165-09280c301ac8\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.282952 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-ansibletest-ansibletest-ansibletest\" (UID: \"0b361a5e-e5b9-4612-a165-09280c301ac8\") " pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:18 crc kubenswrapper[4992]: I0131 10:36:18.549094 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" Jan 31 10:36:19 crc kubenswrapper[4992]: I0131 10:36:19.044555 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest"] Jan 31 10:36:19 crc kubenswrapper[4992]: I0131 10:36:19.468598 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"0b361a5e-e5b9-4612-a165-09280c301ac8","Type":"ContainerStarted","Data":"c338f6b300ecb38214f7f649feb26cbbd5db251479b4ca0ff0123dc5e0139a88"} Jan 31 10:36:20 crc kubenswrapper[4992]: I0131 10:36:20.482705 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" event={"ID":"0b361a5e-e5b9-4612-a165-09280c301ac8","Type":"ContainerStarted","Data":"9eeebe57e75c563dc68387f7475a1547c632059978e9c3fe0e7004a31fcc0972"} Jan 31 10:36:20 crc kubenswrapper[4992]: I0131 10:36:20.503734 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-ansibletest-ansibletest-ansibletest" podStartSLOduration=3.067051959 podStartE2EDuration="3.503710399s" podCreationTimestamp="2026-01-31 10:36:17 +0000 UTC" firstStartedPulling="2026-01-31 10:36:19.048627579 +0000 UTC m=+4275.020019606" lastFinishedPulling="2026-01-31 10:36:19.485286049 +0000 UTC m=+4275.456678046" observedRunningTime="2026-01-31 10:36:20.500892768 +0000 UTC m=+4276.472284815" watchObservedRunningTime="2026-01-31 10:36:20.503710399 +0000 UTC m=+4276.475102396" Jan 31 10:36:24 crc kubenswrapper[4992]: I0131 10:36:24.717371 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grt8c"] Jan 31 10:36:24 crc kubenswrapper[4992]: I0131 10:36:24.720549 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:24 crc kubenswrapper[4992]: I0131 10:36:24.733710 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grt8c"] Jan 31 10:36:24 crc kubenswrapper[4992]: I0131 10:36:24.919314 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rbjs\" (UniqueName: \"kubernetes.io/projected/872d176d-6650-4a80-bfed-7766f112f4b6-kube-api-access-5rbjs\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:24 crc kubenswrapper[4992]: I0131 10:36:24.919496 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-catalog-content\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:24 crc kubenswrapper[4992]: I0131 10:36:24.919536 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-utilities\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.021526 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-catalog-content\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.021618 4992 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-utilities\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.021890 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rbjs\" (UniqueName: \"kubernetes.io/projected/872d176d-6650-4a80-bfed-7766f112f4b6-kube-api-access-5rbjs\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.022216 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-utilities\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.022689 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-catalog-content\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.054934 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rbjs\" (UniqueName: \"kubernetes.io/projected/872d176d-6650-4a80-bfed-7766f112f4b6-kube-api-access-5rbjs\") pod \"redhat-operators-grt8c\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:25 crc kubenswrapper[4992]: I0131 10:36:25.343444 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:26 crc kubenswrapper[4992]: I0131 10:36:26.552140 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grt8c"] Jan 31 10:36:27 crc kubenswrapper[4992]: I0131 10:36:27.548086 4992 generic.go:334] "Generic (PLEG): container finished" podID="872d176d-6650-4a80-bfed-7766f112f4b6" containerID="d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9" exitCode=0 Jan 31 10:36:27 crc kubenswrapper[4992]: I0131 10:36:27.548214 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerDied","Data":"d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9"} Jan 31 10:36:27 crc kubenswrapper[4992]: I0131 10:36:27.548603 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerStarted","Data":"20e109f1ff26039491d60cc47022655a3cbd0f439f8a2cd870e8dc96abb8480f"} Jan 31 10:36:28 crc kubenswrapper[4992]: I0131 10:36:28.568115 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerStarted","Data":"cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1"} Jan 31 10:36:31 crc kubenswrapper[4992]: I0131 10:36:31.610038 4992 generic.go:334] "Generic (PLEG): container finished" podID="872d176d-6650-4a80-bfed-7766f112f4b6" containerID="cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1" exitCode=0 Jan 31 10:36:31 crc kubenswrapper[4992]: I0131 10:36:31.610207 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" 
event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerDied","Data":"cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1"} Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.017047 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizontest-tests-horizontest"] Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.018593 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.021569 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"test-operator-clouds-config" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.022167 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizontest-tests-horizontesthorizontest-config" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.047567 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.102609 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.102999 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.103161 4992 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.103210 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrmb\" (UniqueName: \"kubernetes.io/projected/8d0adaef-9167-4386-94f2-6e638914a9b5-kube-api-access-vfrmb\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.103240 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.103272 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.103297 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc 
kubenswrapper[4992]: I0131 10:36:32.103343 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205636 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205689 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrmb\" (UniqueName: \"kubernetes.io/projected/8d0adaef-9167-4386-94f2-6e638914a9b5-kube-api-access-vfrmb\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205713 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205738 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " 
pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205758 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205794 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205810 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.205835 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.206404 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") device mount path 
\"/mnt/openstack/pv12\"" pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.206570 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-temporary\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.206875 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-workdir\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.207076 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-clouds-config\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.212857 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ceph\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.213257 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ca-certs\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " 
pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.214319 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-openstack-config-secret\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.232255 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrmb\" (UniqueName: \"kubernetes.io/projected/8d0adaef-9167-4386-94f2-6e638914a9b5-kube-api-access-vfrmb\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.267043 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"horizontest-tests-horizontest\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.359519 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.624084 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerStarted","Data":"aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b"} Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.646899 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grt8c" podStartSLOduration=4.174699314 podStartE2EDuration="8.646873445s" podCreationTimestamp="2026-01-31 10:36:24 +0000 UTC" firstStartedPulling="2026-01-31 10:36:27.549757025 +0000 UTC m=+4283.521149012" lastFinishedPulling="2026-01-31 10:36:32.021931116 +0000 UTC m=+4287.993323143" observedRunningTime="2026-01-31 10:36:32.644802076 +0000 UTC m=+4288.616194073" watchObservedRunningTime="2026-01-31 10:36:32.646873445 +0000 UTC m=+4288.618265442" Jan 31 10:36:32 crc kubenswrapper[4992]: I0131 10:36:32.808015 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizontest-tests-horizontest"] Jan 31 10:36:32 crc kubenswrapper[4992]: W0131 10:36:32.813282 4992 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d0adaef_9167_4386_94f2_6e638914a9b5.slice/crio-f1fb9873d4085319f39fdd8386a3048b0eb6e727abadf947708549ce426bde4d WatchSource:0}: Error finding container f1fb9873d4085319f39fdd8386a3048b0eb6e727abadf947708549ce426bde4d: Status 404 returned error can't find the container with id f1fb9873d4085319f39fdd8386a3048b0eb6e727abadf947708549ce426bde4d Jan 31 10:36:33 crc kubenswrapper[4992]: I0131 10:36:33.635608 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" 
event={"ID":"8d0adaef-9167-4386-94f2-6e638914a9b5","Type":"ContainerStarted","Data":"f1fb9873d4085319f39fdd8386a3048b0eb6e727abadf947708549ce426bde4d"} Jan 31 10:36:35 crc kubenswrapper[4992]: I0131 10:36:35.343994 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:35 crc kubenswrapper[4992]: I0131 10:36:35.344310 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:36:36 crc kubenswrapper[4992]: I0131 10:36:36.405080 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grt8c" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" probeResult="failure" output=< Jan 31 10:36:36 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 10:36:36 crc kubenswrapper[4992]: > Jan 31 10:36:45 crc kubenswrapper[4992]: I0131 10:36:45.301524 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:36:45 crc kubenswrapper[4992]: I0131 10:36:45.302001 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:36:46 crc kubenswrapper[4992]: I0131 10:36:46.519903 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grt8c" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" probeResult="failure" output=< Jan 31 10:36:46 crc 
kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 10:36:46 crc kubenswrapper[4992]: > Jan 31 10:36:50 crc kubenswrapper[4992]: E0131 10:36:50.704661 4992 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizontest:current-podified" Jan 31 10:36:50 crc kubenswrapper[4992]: E0131 10:36:50.706075 4992 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizontest-tests-horizontest,Image:quay.io/podified-antelope-centos9/openstack-horizontest:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADMIN_PASSWORD,Value:12345678,ValueFrom:nil,},EnvVar{Name:ADMIN_USERNAME,Value:admin,ValueFrom:nil,},EnvVar{Name:AUTH_URL,Value:https://keystone-public-openstack.apps-crc.testing,ValueFrom:nil,},EnvVar{Name:DASHBOARD_URL,Value:https://horizon-openstack.apps-crc.testing/,ValueFrom:nil,},EnvVar{Name:EXTRA_FLAG,Value:not pagination and 
test_users.py,ValueFrom:nil,},EnvVar{Name:FLAVOR_NAME,Value:m1.tiny,ValueFrom:nil,},EnvVar{Name:HORIZONTEST_DEBUG_MODE,Value:false,ValueFrom:nil,},EnvVar{Name:HORIZON_KEYS_FOLDER,Value:/etc/test_operator,ValueFrom:nil,},EnvVar{Name:HORIZON_LOGS_DIR_NAME,Value:horizon,ValueFrom:nil,},EnvVar{Name:HORIZON_REPO_BRANCH,Value:master,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE,Value:/var/lib/horizontest/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:IMAGE_FILE_NAME,Value:cirros-0.6.2-x86_64-disk,ValueFrom:nil,},EnvVar{Name:IMAGE_URL,Value:http://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img,ValueFrom:nil,},EnvVar{Name:PASSWORD,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:PROJECT_NAME_XPATH,Value://*[@class=\"context-project\"]//ancestor::ul,ValueFrom:nil,},EnvVar{Name:REPO_URL,Value:https://review.opendev.org/openstack/horizon,ValueFrom:nil,},EnvVar{Name:USER_NAME,Value:horizontest,ValueFrom:nil,},EnvVar{Name:USE_EXTERNAL_FILES,Value:True,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{2 0} {} 2 DecimalSI},memory: {{4294967296 0} {} 4Gi BinarySI},},Requests:ResourceList{cpu: {{1 0} {} 1 DecimalSI},memory: {{2147483648 0} {} 2Gi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/horizontest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/horizontest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/var/lib/horizontest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-clouds-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ca-bundle.trust.crt,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ceph,ReadOnly:true,MountPath:/etc/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vfrmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN 
NET_RAW],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42455,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42455,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizontest-tests-horizontest_openstack(8d0adaef-9167-4386-94f2-6e638914a9b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 10:36:50 crc kubenswrapper[4992]: E0131 10:36:50.707402 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizontest-tests-horizontest" podUID="8d0adaef-9167-4386-94f2-6e638914a9b5" Jan 31 10:36:50 crc kubenswrapper[4992]: E0131 10:36:50.819000 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizontest-tests-horizontest\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizontest:current-podified\\\"\"" pod="openstack/horizontest-tests-horizontest" podUID="8d0adaef-9167-4386-94f2-6e638914a9b5" Jan 31 10:36:56 crc kubenswrapper[4992]: I0131 10:36:56.385554 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grt8c" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" probeResult="failure" output=< Jan 31 10:36:56 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 10:36:56 crc kubenswrapper[4992]: > Jan 31 10:37:04 crc kubenswrapper[4992]: I0131 10:37:04.937987 4992 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"8d0adaef-9167-4386-94f2-6e638914a9b5","Type":"ContainerStarted","Data":"1778fd88a7cc3c85cfb08535ee4bf23ecc7042b015ddf09acd7fc216292f171d"} Jan 31 10:37:04 crc kubenswrapper[4992]: I0131 10:37:04.969641 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizontest-tests-horizontest" podStartSLOduration=4.136134588 podStartE2EDuration="34.969618636s" podCreationTimestamp="2026-01-31 10:36:30 +0000 UTC" firstStartedPulling="2026-01-31 10:36:32.816410231 +0000 UTC m=+4288.787802218" lastFinishedPulling="2026-01-31 10:37:03.649894269 +0000 UTC m=+4319.621286266" observedRunningTime="2026-01-31 10:37:04.960773274 +0000 UTC m=+4320.932165311" watchObservedRunningTime="2026-01-31 10:37:04.969618636 +0000 UTC m=+4320.941010633" Jan 31 10:37:05 crc kubenswrapper[4992]: I0131 10:37:05.393773 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:37:05 crc kubenswrapper[4992]: I0131 10:37:05.439637 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:37:05 crc kubenswrapper[4992]: I0131 10:37:05.638663 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grt8c"] Jan 31 10:37:06 crc kubenswrapper[4992]: I0131 10:37:06.967145 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grt8c" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" containerID="cri-o://aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b" gracePeriod=2 Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.466566 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.494829 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-utilities\") pod \"872d176d-6650-4a80-bfed-7766f112f4b6\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.494986 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-catalog-content\") pod \"872d176d-6650-4a80-bfed-7766f112f4b6\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.496498 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-utilities" (OuterVolumeSpecName: "utilities") pod "872d176d-6650-4a80-bfed-7766f112f4b6" (UID: "872d176d-6650-4a80-bfed-7766f112f4b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.596697 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rbjs\" (UniqueName: \"kubernetes.io/projected/872d176d-6650-4a80-bfed-7766f112f4b6-kube-api-access-5rbjs\") pod \"872d176d-6650-4a80-bfed-7766f112f4b6\" (UID: \"872d176d-6650-4a80-bfed-7766f112f4b6\") " Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.597413 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.621605 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "872d176d-6650-4a80-bfed-7766f112f4b6" (UID: "872d176d-6650-4a80-bfed-7766f112f4b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.699563 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/872d176d-6650-4a80-bfed-7766f112f4b6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.975552 4992 generic.go:334] "Generic (PLEG): container finished" podID="872d176d-6650-4a80-bfed-7766f112f4b6" containerID="aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b" exitCode=0 Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.975595 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerDied","Data":"aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b"} Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.975613 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grt8c" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.975633 4992 scope.go:117] "RemoveContainer" containerID="aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b" Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.975620 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grt8c" event={"ID":"872d176d-6650-4a80-bfed-7766f112f4b6","Type":"ContainerDied","Data":"20e109f1ff26039491d60cc47022655a3cbd0f439f8a2cd870e8dc96abb8480f"} Jan 31 10:37:07 crc kubenswrapper[4992]: I0131 10:37:07.993158 4992 scope.go:117] "RemoveContainer" containerID="cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.079338 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872d176d-6650-4a80-bfed-7766f112f4b6-kube-api-access-5rbjs" (OuterVolumeSpecName: "kube-api-access-5rbjs") pod "872d176d-6650-4a80-bfed-7766f112f4b6" (UID: "872d176d-6650-4a80-bfed-7766f112f4b6"). InnerVolumeSpecName "kube-api-access-5rbjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.101728 4992 scope.go:117] "RemoveContainer" containerID="d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.107876 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rbjs\" (UniqueName: \"kubernetes.io/projected/872d176d-6650-4a80-bfed-7766f112f4b6-kube-api-access-5rbjs\") on node \"crc\" DevicePath \"\"" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.181736 4992 scope.go:117] "RemoveContainer" containerID="aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b" Jan 31 10:37:08 crc kubenswrapper[4992]: E0131 10:37:08.182211 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b\": container with ID starting with aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b not found: ID does not exist" containerID="aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.182255 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b"} err="failed to get container status \"aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b\": rpc error: code = NotFound desc = could not find container \"aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b\": container with ID starting with aa0de045aaa50e2abbe7c81d508030470f858a1262b69e63c8608c527af74f4b not found: ID does not exist" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.182287 4992 scope.go:117] "RemoveContainer" containerID="cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1" Jan 31 10:37:08 crc kubenswrapper[4992]: E0131 10:37:08.182723 
4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1\": container with ID starting with cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1 not found: ID does not exist" containerID="cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.182764 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1"} err="failed to get container status \"cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1\": rpc error: code = NotFound desc = could not find container \"cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1\": container with ID starting with cf65765c60819bc3c5de0892c397588ebe4aa4b85ba086bff4772d0f4d317ad1 not found: ID does not exist" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.182796 4992 scope.go:117] "RemoveContainer" containerID="d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9" Jan 31 10:37:08 crc kubenswrapper[4992]: E0131 10:37:08.183450 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9\": container with ID starting with d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9 not found: ID does not exist" containerID="d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.183487 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9"} err="failed to get container status \"d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9\": rpc error: code = 
NotFound desc = could not find container \"d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9\": container with ID starting with d7ab3db7277a01cf2bfcc49b3873ed84c71646c86cf4795cb808728bfe40d9f9 not found: ID does not exist" Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.312465 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grt8c"] Jan 31 10:37:08 crc kubenswrapper[4992]: I0131 10:37:08.321782 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grt8c"] Jan 31 10:37:09 crc kubenswrapper[4992]: I0131 10:37:09.192500 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" path="/var/lib/kubelet/pods/872d176d-6650-4a80-bfed-7766f112f4b6/volumes" Jan 31 10:37:15 crc kubenswrapper[4992]: I0131 10:37:15.301539 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:37:15 crc kubenswrapper[4992]: I0131 10:37:15.302535 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:37:45 crc kubenswrapper[4992]: I0131 10:37:45.301067 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:37:45 crc kubenswrapper[4992]: I0131 10:37:45.301686 4992 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:37:45 crc kubenswrapper[4992]: I0131 10:37:45.301743 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:37:45 crc kubenswrapper[4992]: I0131 10:37:45.302566 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:37:45 crc kubenswrapper[4992]: I0131 10:37:45.302636 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" gracePeriod=600 Jan 31 10:37:45 crc kubenswrapper[4992]: E0131 10:37:45.436851 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:37:46 crc kubenswrapper[4992]: I0131 10:37:46.385784 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" exitCode=0 Jan 31 10:37:46 crc kubenswrapper[4992]: I0131 10:37:46.385876 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e"} Jan 31 10:37:46 crc kubenswrapper[4992]: I0131 10:37:46.386154 4992 scope.go:117] "RemoveContainer" containerID="0a0e3642d17199772632eec18bcc072f796c75cfc96e2995d9dbdd6cc4109275" Jan 31 10:37:46 crc kubenswrapper[4992]: I0131 10:37:46.388237 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:37:46 crc kubenswrapper[4992]: E0131 10:37:46.388614 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:37:59 crc kubenswrapper[4992]: I0131 10:37:59.183109 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:37:59 crc kubenswrapper[4992]: E0131 10:37:59.183975 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 
10:38:10 crc kubenswrapper[4992]: I0131 10:38:10.183654 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:38:10 crc kubenswrapper[4992]: E0131 10:38:10.186302 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:38:21 crc kubenswrapper[4992]: I0131 10:38:21.182750 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:38:21 crc kubenswrapper[4992]: E0131 10:38:21.183524 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:38:32 crc kubenswrapper[4992]: I0131 10:38:32.183250 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:38:32 crc kubenswrapper[4992]: E0131 10:38:32.184081 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:38:45 crc kubenswrapper[4992]: I0131 10:38:45.191846 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:38:45 crc kubenswrapper[4992]: E0131 10:38:45.192898 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:38:57 crc kubenswrapper[4992]: I0131 10:38:57.182718 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:38:57 crc kubenswrapper[4992]: E0131 10:38:57.184187 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:39:02 crc kubenswrapper[4992]: I0131 10:39:02.176263 4992 generic.go:334] "Generic (PLEG): container finished" podID="8d0adaef-9167-4386-94f2-6e638914a9b5" containerID="1778fd88a7cc3c85cfb08535ee4bf23ecc7042b015ddf09acd7fc216292f171d" exitCode=0 Jan 31 10:39:02 crc kubenswrapper[4992]: I0131 10:39:02.176474 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"8d0adaef-9167-4386-94f2-6e638914a9b5","Type":"ContainerDied","Data":"1778fd88a7cc3c85cfb08535ee4bf23ecc7042b015ddf09acd7fc216292f171d"} Jan 31 10:39:03 crc kubenswrapper[4992]: 
I0131 10:39:03.663483 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizontest-tests-horizontest" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.763409 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-temporary\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.763782 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-clouds-config\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.763813 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-workdir\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.763871 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfrmb\" (UniqueName: \"kubernetes.io/projected/8d0adaef-9167-4386-94f2-6e638914a9b5-kube-api-access-vfrmb\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.763926 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ceph\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: 
\"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.763953 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.764038 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-openstack-config-secret\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.764068 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.764088 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ca-certs\") pod \"8d0adaef-9167-4386-94f2-6e638914a9b5\" (UID: \"8d0adaef-9167-4386-94f2-6e638914a9b5\") " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.764514 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.770474 4992 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.778961 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0adaef-9167-4386-94f2-6e638914a9b5-kube-api-access-vfrmb" (OuterVolumeSpecName: "kube-api-access-vfrmb") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "kube-api-access-vfrmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.787593 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ceph" (OuterVolumeSpecName: "ceph") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.795622 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.814888 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-clouds-config" (OuterVolumeSpecName: "test-operator-clouds-config") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "test-operator-clouds-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.818005 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.868163 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-clouds-config\" (UniqueName: \"kubernetes.io/configmap/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-clouds-config\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.868194 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfrmb\" (UniqueName: \"kubernetes.io/projected/8d0adaef-9167-4386-94f2-6e638914a9b5-kube-api-access-vfrmb\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.868205 4992 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ceph\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.868215 4992 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.868241 4992 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.868250 4992 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8d0adaef-9167-4386-94f2-6e638914a9b5-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.886547 4992 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.970481 4992 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:03 crc kubenswrapper[4992]: I0131 10:39:03.974500 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "8d0adaef-9167-4386-94f2-6e638914a9b5" (UID: "8d0adaef-9167-4386-94f2-6e638914a9b5"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:39:04 crc kubenswrapper[4992]: I0131 10:39:04.072811 4992 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8d0adaef-9167-4386-94f2-6e638914a9b5-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 31 10:39:04 crc kubenswrapper[4992]: I0131 10:39:04.206039 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizontest-tests-horizontest" event={"ID":"8d0adaef-9167-4386-94f2-6e638914a9b5","Type":"ContainerDied","Data":"f1fb9873d4085319f39fdd8386a3048b0eb6e727abadf947708549ce426bde4d"} Jan 31 10:39:04 crc kubenswrapper[4992]: I0131 10:39:04.206091 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1fb9873d4085319f39fdd8386a3048b0eb6e727abadf947708549ce426bde4d" Jan 31 10:39:04 crc kubenswrapper[4992]: I0131 10:39:04.206108 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizontest-tests-horizontest" Jan 31 10:39:12 crc kubenswrapper[4992]: I0131 10:39:12.183533 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:39:12 crc kubenswrapper[4992]: E0131 10:39:12.184454 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.871376 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Jan 31 10:39:14 crc kubenswrapper[4992]: E0131 10:39:14.872237 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.872250 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" Jan 31 10:39:14 crc kubenswrapper[4992]: E0131 10:39:14.872267 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="extract-utilities" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.872273 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="extract-utilities" Jan 31 10:39:14 crc kubenswrapper[4992]: E0131 10:39:14.872281 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0adaef-9167-4386-94f2-6e638914a9b5" containerName="horizontest-tests-horizontest" Jan 31 10:39:14 crc kubenswrapper[4992]: 
I0131 10:39:14.872288 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0adaef-9167-4386-94f2-6e638914a9b5" containerName="horizontest-tests-horizontest" Jan 31 10:39:14 crc kubenswrapper[4992]: E0131 10:39:14.872296 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="extract-content" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.872301 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="extract-content" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.872479 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0adaef-9167-4386-94f2-6e638914a9b5" containerName="horizontest-tests-horizontest" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.872501 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="872d176d-6650-4a80-bfed-7766f112f4b6" containerName="registry-server" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.873066 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:14 crc kubenswrapper[4992]: I0131 10:39:14.879480 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.002996 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.003102 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfz7h\" (UniqueName: \"kubernetes.io/projected/bd7ef245-70f0-447f-b0f5-8e7d7df4325f-kube-api-access-sfz7h\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.105648 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.105748 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfz7h\" (UniqueName: \"kubernetes.io/projected/bd7ef245-70f0-447f-b0f5-8e7d7df4325f-kube-api-access-sfz7h\") pod 
\"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.106511 4992 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.130889 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfz7h\" (UniqueName: \"kubernetes.io/projected/bd7ef245-70f0-447f-b0f5-8e7d7df4325f-kube-api-access-sfz7h\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.160581 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-horizontest-horizontest-tests-horizontest\" (UID: \"bd7ef245-70f0-447f-b0f5-8e7d7df4325f\") " pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.196819 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" Jan 31 10:39:15 crc kubenswrapper[4992]: E0131 10:39:15.196919 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:39:15 crc kubenswrapper[4992]: I0131 10:39:15.730707 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest"] Jan 31 10:39:15 crc kubenswrapper[4992]: E0131 10:39:15.740633 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:39:16 crc kubenswrapper[4992]: E0131 10:39:16.151355 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:39:16 crc kubenswrapper[4992]: I0131 10:39:16.371690 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"bd7ef245-70f0-447f-b0f5-8e7d7df4325f","Type":"ContainerStarted","Data":"2bf4dfb86ab80f7ae7505fbf9355a92a0671ff8b8bf085da9e4b90876af76b77"} Jan 31 10:39:17 crc kubenswrapper[4992]: I0131 10:39:17.387719 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" event={"ID":"bd7ef245-70f0-447f-b0f5-8e7d7df4325f","Type":"ContainerStarted","Data":"8b67068ebd66fecde91bf9dccb9673354010ed2ae4c3dbc75fcccab50d71cda8"} Jan 31 10:39:17 crc kubenswrapper[4992]: E0131 
10:39:17.389742 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:39:17 crc kubenswrapper[4992]: I0131 10:39:17.415796 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-horizontest-horizontest-tests-horizontest" podStartSLOduration=3.006551309 podStartE2EDuration="3.415772307s" podCreationTimestamp="2026-01-31 10:39:14 +0000 UTC" firstStartedPulling="2026-01-31 10:39:15.742043484 +0000 UTC m=+4451.713435471" lastFinishedPulling="2026-01-31 10:39:16.151264442 +0000 UTC m=+4452.122656469" observedRunningTime="2026-01-31 10:39:17.407898453 +0000 UTC m=+4453.379290520" watchObservedRunningTime="2026-01-31 10:39:17.415772307 +0000 UTC m=+4453.387164324" Jan 31 10:39:18 crc kubenswrapper[4992]: E0131 10:39:18.400722 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:39:24 crc kubenswrapper[4992]: I0131 10:39:24.183681 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:39:24 crc kubenswrapper[4992]: E0131 10:39:24.184774 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:39:36 crc kubenswrapper[4992]: I0131 10:39:36.183033 4992 scope.go:117] 
"RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:39:36 crc kubenswrapper[4992]: E0131 10:39:36.183741 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:39:48 crc kubenswrapper[4992]: I0131 10:39:48.183804 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:39:48 crc kubenswrapper[4992]: E0131 10:39:48.185084 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.011387 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4695/must-gather-lchbd"] Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.014732 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.017188 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-z4695"/"openshift-service-ca.crt" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.018190 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-z4695"/"default-dockercfg-8qjbz" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.018957 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-z4695"/"kube-root-ca.crt" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.021019 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z4695/must-gather-lchbd"] Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.148406 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqvz7\" (UniqueName: \"kubernetes.io/projected/5207e946-075f-4090-9596-ba1db236cb90-kube-api-access-jqvz7\") pod \"must-gather-lchbd\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.148490 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5207e946-075f-4090-9596-ba1db236cb90-must-gather-output\") pod \"must-gather-lchbd\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.250142 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqvz7\" (UniqueName: \"kubernetes.io/projected/5207e946-075f-4090-9596-ba1db236cb90-kube-api-access-jqvz7\") pod \"must-gather-lchbd\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " 
pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.250205 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5207e946-075f-4090-9596-ba1db236cb90-must-gather-output\") pod \"must-gather-lchbd\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.250735 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5207e946-075f-4090-9596-ba1db236cb90-must-gather-output\") pod \"must-gather-lchbd\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.270046 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqvz7\" (UniqueName: \"kubernetes.io/projected/5207e946-075f-4090-9596-ba1db236cb90-kube-api-access-jqvz7\") pod \"must-gather-lchbd\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.330768 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:39:57 crc kubenswrapper[4992]: I0131 10:39:57.930171 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z4695/must-gather-lchbd"] Jan 31 10:39:58 crc kubenswrapper[4992]: I0131 10:39:58.883792 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/must-gather-lchbd" event={"ID":"5207e946-075f-4090-9596-ba1db236cb90","Type":"ContainerStarted","Data":"2ce4f710fead01ad04864e663ddf97d33487884d7b7e95fd949aa08416c1d4cb"} Jan 31 10:40:02 crc kubenswrapper[4992]: I0131 10:40:02.183716 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:40:02 crc kubenswrapper[4992]: E0131 10:40:02.185498 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:40:02 crc kubenswrapper[4992]: I0131 10:40:02.926105 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/must-gather-lchbd" event={"ID":"5207e946-075f-4090-9596-ba1db236cb90","Type":"ContainerStarted","Data":"4711d7b80a05aebf938de7f0ef1adb570ca64a6fe390fceef42a9d33f41d6c90"} Jan 31 10:40:02 crc kubenswrapper[4992]: I0131 10:40:02.927010 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/must-gather-lchbd" event={"ID":"5207e946-075f-4090-9596-ba1db236cb90","Type":"ContainerStarted","Data":"d28f4d5021be4befe3f55d259ffdc4ed5d31c537b49f43223c0af04bd20dcd38"} Jan 31 10:40:02 crc kubenswrapper[4992]: I0131 10:40:02.956326 4992 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-z4695/must-gather-lchbd" podStartSLOduration=3.183993333 podStartE2EDuration="6.956304032s" podCreationTimestamp="2026-01-31 10:39:56 +0000 UTC" firstStartedPulling="2026-01-31 10:39:58.284880709 +0000 UTC m=+4494.256272736" lastFinishedPulling="2026-01-31 10:40:02.057191448 +0000 UTC m=+4498.028583435" observedRunningTime="2026-01-31 10:40:02.950847066 +0000 UTC m=+4498.922239063" watchObservedRunningTime="2026-01-31 10:40:02.956304032 +0000 UTC m=+4498.927696039" Jan 31 10:40:05 crc kubenswrapper[4992]: E0131 10:40:05.690967 4992 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.243:51038->38.129.56.243:43563: write tcp 38.129.56.243:51038->38.129.56.243:43563: write: broken pipe Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.414465 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4695/crc-debug-jvpcp"] Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.416874 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.571544 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bbs\" (UniqueName: \"kubernetes.io/projected/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-kube-api-access-p2bbs\") pod \"crc-debug-jvpcp\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.571872 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-host\") pod \"crc-debug-jvpcp\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.673993 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bbs\" (UniqueName: \"kubernetes.io/projected/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-kube-api-access-p2bbs\") pod \"crc-debug-jvpcp\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.674102 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-host\") pod \"crc-debug-jvpcp\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.674224 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-host\") pod \"crc-debug-jvpcp\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc 
kubenswrapper[4992]: I0131 10:40:07.693533 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bbs\" (UniqueName: \"kubernetes.io/projected/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-kube-api-access-p2bbs\") pod \"crc-debug-jvpcp\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.733505 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:07 crc kubenswrapper[4992]: I0131 10:40:07.989589 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/crc-debug-jvpcp" event={"ID":"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c","Type":"ContainerStarted","Data":"2a8f41166bde620bfe6217d652b76249c7ad1ebd5dfce2c2f5789b2ad3116774"} Jan 31 10:40:13 crc kubenswrapper[4992]: I0131 10:40:13.182668 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:40:13 crc kubenswrapper[4992]: E0131 10:40:13.183291 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:40:19 crc kubenswrapper[4992]: I0131 10:40:19.102123 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/crc-debug-jvpcp" event={"ID":"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c","Type":"ContainerStarted","Data":"843bbbb3f396e273fa8fb8463c794196fd8c1a3ce99fcc8352815e010837daff"} Jan 31 10:40:19 crc kubenswrapper[4992]: I0131 10:40:19.118124 4992 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-z4695/crc-debug-jvpcp" podStartSLOduration=1.095981804 podStartE2EDuration="12.118086387s" podCreationTimestamp="2026-01-31 10:40:07 +0000 UTC" firstStartedPulling="2026-01-31 10:40:07.759449144 +0000 UTC m=+4503.730841131" lastFinishedPulling="2026-01-31 10:40:18.781553727 +0000 UTC m=+4514.752945714" observedRunningTime="2026-01-31 10:40:19.113839666 +0000 UTC m=+4515.085231673" watchObservedRunningTime="2026-01-31 10:40:19.118086387 +0000 UTC m=+4515.089478374" Jan 31 10:40:19 crc kubenswrapper[4992]: E0131 10:40:19.183324 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:40:25 crc kubenswrapper[4992]: I0131 10:40:25.189177 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:40:25 crc kubenswrapper[4992]: E0131 10:40:25.190473 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:40:39 crc kubenswrapper[4992]: I0131 10:40:39.182338 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:40:39 crc kubenswrapper[4992]: E0131 10:40:39.183176 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:40:40 crc kubenswrapper[4992]: I0131 10:40:40.276310 4992 generic.go:334] "Generic (PLEG): container finished" podID="9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" containerID="843bbbb3f396e273fa8fb8463c794196fd8c1a3ce99fcc8352815e010837daff" exitCode=0 Jan 31 10:40:40 crc kubenswrapper[4992]: I0131 10:40:40.276405 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/crc-debug-jvpcp" event={"ID":"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c","Type":"ContainerDied","Data":"843bbbb3f396e273fa8fb8463c794196fd8c1a3ce99fcc8352815e010837daff"} Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.408672 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.435639 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2bbs\" (UniqueName: \"kubernetes.io/projected/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-kube-api-access-p2bbs\") pod \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.435921 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-host\") pod \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\" (UID: \"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c\") " Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.436830 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-host" (OuterVolumeSpecName: "host") pod "9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" (UID: 
"9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.444578 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-kube-api-access-p2bbs" (OuterVolumeSpecName: "kube-api-access-p2bbs") pod "9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" (UID: "9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c"). InnerVolumeSpecName "kube-api-access-p2bbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.451045 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4695/crc-debug-jvpcp"] Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.460449 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4695/crc-debug-jvpcp"] Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.538008 4992 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-host\") on node \"crc\" DevicePath \"\"" Jan 31 10:40:41 crc kubenswrapper[4992]: I0131 10:40:41.538232 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2bbs\" (UniqueName: \"kubernetes.io/projected/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c-kube-api-access-p2bbs\") on node \"crc\" DevicePath \"\"" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.297574 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8f41166bde620bfe6217d652b76249c7ad1ebd5dfce2c2f5789b2ad3116774" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.297654 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-jvpcp" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.633975 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4695/crc-debug-f9xhv"] Jan 31 10:40:42 crc kubenswrapper[4992]: E0131 10:40:42.634748 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" containerName="container-00" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.634762 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" containerName="container-00" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.634950 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" containerName="container-00" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.635574 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.762240 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn6k9\" (UniqueName: \"kubernetes.io/projected/8c7be11f-3892-4a5a-a5ba-8834312a11ca-kube-api-access-qn6k9\") pod \"crc-debug-f9xhv\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.762375 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c7be11f-3892-4a5a-a5ba-8834312a11ca-host\") pod \"crc-debug-f9xhv\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.864413 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn6k9\" (UniqueName: 
\"kubernetes.io/projected/8c7be11f-3892-4a5a-a5ba-8834312a11ca-kube-api-access-qn6k9\") pod \"crc-debug-f9xhv\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.864774 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c7be11f-3892-4a5a-a5ba-8834312a11ca-host\") pod \"crc-debug-f9xhv\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.864914 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c7be11f-3892-4a5a-a5ba-8834312a11ca-host\") pod \"crc-debug-f9xhv\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.894157 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn6k9\" (UniqueName: \"kubernetes.io/projected/8c7be11f-3892-4a5a-a5ba-8834312a11ca-kube-api-access-qn6k9\") pod \"crc-debug-f9xhv\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:42 crc kubenswrapper[4992]: I0131 10:40:42.951890 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:43 crc kubenswrapper[4992]: I0131 10:40:43.194800 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c" path="/var/lib/kubelet/pods/9a04eed5-5a81-4ebb-9fe8-31cc8152eb0c/volumes" Jan 31 10:40:43 crc kubenswrapper[4992]: I0131 10:40:43.307831 4992 generic.go:334] "Generic (PLEG): container finished" podID="8c7be11f-3892-4a5a-a5ba-8834312a11ca" containerID="90a6dcca95371a10fa1ec89578cf84127b0e6aed894d8abbd425558f3c740aee" exitCode=1 Jan 31 10:40:43 crc kubenswrapper[4992]: I0131 10:40:43.307879 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/crc-debug-f9xhv" event={"ID":"8c7be11f-3892-4a5a-a5ba-8834312a11ca","Type":"ContainerDied","Data":"90a6dcca95371a10fa1ec89578cf84127b0e6aed894d8abbd425558f3c740aee"} Jan 31 10:40:43 crc kubenswrapper[4992]: I0131 10:40:43.307911 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/crc-debug-f9xhv" event={"ID":"8c7be11f-3892-4a5a-a5ba-8834312a11ca","Type":"ContainerStarted","Data":"00f996625f810a47afbbcb70c77bedc9a9b8151ffba1e40fda4b7158a7d753d5"} Jan 31 10:40:43 crc kubenswrapper[4992]: I0131 10:40:43.355928 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4695/crc-debug-f9xhv"] Jan 31 10:40:43 crc kubenswrapper[4992]: I0131 10:40:43.364334 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4695/crc-debug-f9xhv"] Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.422169 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.495611 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c7be11f-3892-4a5a-a5ba-8834312a11ca-host\") pod \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.495812 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn6k9\" (UniqueName: \"kubernetes.io/projected/8c7be11f-3892-4a5a-a5ba-8834312a11ca-kube-api-access-qn6k9\") pod \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\" (UID: \"8c7be11f-3892-4a5a-a5ba-8834312a11ca\") " Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.496475 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c7be11f-3892-4a5a-a5ba-8834312a11ca-host" (OuterVolumeSpecName: "host") pod "8c7be11f-3892-4a5a-a5ba-8834312a11ca" (UID: "8c7be11f-3892-4a5a-a5ba-8834312a11ca"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.506077 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7be11f-3892-4a5a-a5ba-8834312a11ca-kube-api-access-qn6k9" (OuterVolumeSpecName: "kube-api-access-qn6k9") pod "8c7be11f-3892-4a5a-a5ba-8834312a11ca" (UID: "8c7be11f-3892-4a5a-a5ba-8834312a11ca"). InnerVolumeSpecName "kube-api-access-qn6k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.597824 4992 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c7be11f-3892-4a5a-a5ba-8834312a11ca-host\") on node \"crc\" DevicePath \"\"" Jan 31 10:40:44 crc kubenswrapper[4992]: I0131 10:40:44.597853 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn6k9\" (UniqueName: \"kubernetes.io/projected/8c7be11f-3892-4a5a-a5ba-8834312a11ca-kube-api-access-qn6k9\") on node \"crc\" DevicePath \"\"" Jan 31 10:40:45 crc kubenswrapper[4992]: I0131 10:40:45.202716 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7be11f-3892-4a5a-a5ba-8834312a11ca" path="/var/lib/kubelet/pods/8c7be11f-3892-4a5a-a5ba-8834312a11ca/volumes" Jan 31 10:40:45 crc kubenswrapper[4992]: I0131 10:40:45.327510 4992 scope.go:117] "RemoveContainer" containerID="90a6dcca95371a10fa1ec89578cf84127b0e6aed894d8abbd425558f3c740aee" Jan 31 10:40:45 crc kubenswrapper[4992]: I0131 10:40:45.327554 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/crc-debug-f9xhv" Jan 31 10:40:52 crc kubenswrapper[4992]: I0131 10:40:52.182655 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:40:52 crc kubenswrapper[4992]: E0131 10:40:52.183407 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:41:07 crc kubenswrapper[4992]: I0131 10:41:07.182787 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:41:07 crc kubenswrapper[4992]: E0131 10:41:07.184402 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:41:19 crc kubenswrapper[4992]: I0131 10:41:19.182910 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:41:19 crc kubenswrapper[4992]: E0131 10:41:19.183783 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:41:34 crc kubenswrapper[4992]: I0131 10:41:34.183123 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:41:34 crc kubenswrapper[4992]: E0131 10:41:34.184000 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:41:36 crc kubenswrapper[4992]: E0131 10:41:36.184081 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.121381 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-68xqk"] Jan 31 10:41:45 crc kubenswrapper[4992]: E0131 10:41:45.122492 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7be11f-3892-4a5a-a5ba-8834312a11ca" containerName="container-00" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.122508 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7be11f-3892-4a5a-a5ba-8834312a11ca" containerName="container-00" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.122744 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7be11f-3892-4a5a-a5ba-8834312a11ca" containerName="container-00" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.125822 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.133175 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68xqk"] Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.209393 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-catalog-content\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.209476 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s692w\" (UniqueName: \"kubernetes.io/projected/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-kube-api-access-s692w\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.209573 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-utilities\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.311889 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-catalog-content\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.311951 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s692w\" (UniqueName: \"kubernetes.io/projected/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-kube-api-access-s692w\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.312006 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-utilities\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.312389 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-utilities\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.312483 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-catalog-content\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.333545 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s692w\" (UniqueName: \"kubernetes.io/projected/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-kube-api-access-s692w\") pod \"redhat-marketplace-68xqk\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.457217 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.946295 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68xqk"] Jan 31 10:41:45 crc kubenswrapper[4992]: I0131 10:41:45.956893 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68xqk" event={"ID":"63182ea9-4017-41c5-96ec-08c4f2d7fe0b","Type":"ContainerStarted","Data":"d24b7f9268bbd5935a61712a0ffb63ff2bdc3cefc15314b780818bb6ae0f20bd"} Jan 31 10:41:46 crc kubenswrapper[4992]: I0131 10:41:46.182537 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:41:46 crc kubenswrapper[4992]: E0131 10:41:46.183039 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:41:46 crc kubenswrapper[4992]: I0131 10:41:46.972397 4992 generic.go:334] "Generic (PLEG): container finished" podID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerID="85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301" exitCode=0 Jan 31 10:41:46 crc kubenswrapper[4992]: I0131 10:41:46.972489 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68xqk" event={"ID":"63182ea9-4017-41c5-96ec-08c4f2d7fe0b","Type":"ContainerDied","Data":"85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301"} Jan 31 10:41:46 crc kubenswrapper[4992]: I0131 10:41:46.976006 4992 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 
10:41:48 crc kubenswrapper[4992]: I0131 10:41:48.991106 4992 generic.go:334] "Generic (PLEG): container finished" podID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerID="6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605" exitCode=0 Jan 31 10:41:48 crc kubenswrapper[4992]: I0131 10:41:48.991142 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68xqk" event={"ID":"63182ea9-4017-41c5-96ec-08c4f2d7fe0b","Type":"ContainerDied","Data":"6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605"} Jan 31 10:41:50 crc kubenswrapper[4992]: I0131 10:41:50.003465 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68xqk" event={"ID":"63182ea9-4017-41c5-96ec-08c4f2d7fe0b","Type":"ContainerStarted","Data":"a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84"} Jan 31 10:41:50 crc kubenswrapper[4992]: I0131 10:41:50.033450 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-68xqk" podStartSLOduration=2.633410844 podStartE2EDuration="5.033433701s" podCreationTimestamp="2026-01-31 10:41:45 +0000 UTC" firstStartedPulling="2026-01-31 10:41:46.975593309 +0000 UTC m=+4602.946985326" lastFinishedPulling="2026-01-31 10:41:49.375616186 +0000 UTC m=+4605.347008183" observedRunningTime="2026-01-31 10:41:50.031172527 +0000 UTC m=+4606.002564514" watchObservedRunningTime="2026-01-31 10:41:50.033433701 +0000 UTC m=+4606.004825688" Jan 31 10:41:55 crc kubenswrapper[4992]: I0131 10:41:55.458347 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:55 crc kubenswrapper[4992]: I0131 10:41:55.459028 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:55 crc kubenswrapper[4992]: I0131 10:41:55.562140 4992 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:56 crc kubenswrapper[4992]: I0131 10:41:56.121153 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:56 crc kubenswrapper[4992]: I0131 10:41:56.184523 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-68xqk"] Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.091732 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-68xqk" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="registry-server" containerID="cri-o://a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84" gracePeriod=2 Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.194924 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:41:58 crc kubenswrapper[4992]: E0131 10:41:58.195743 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.575801 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.688163 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s692w\" (UniqueName: \"kubernetes.io/projected/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-kube-api-access-s692w\") pod \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.688309 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-catalog-content\") pod \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.688338 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-utilities\") pod \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\" (UID: \"63182ea9-4017-41c5-96ec-08c4f2d7fe0b\") " Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.689327 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-utilities" (OuterVolumeSpecName: "utilities") pod "63182ea9-4017-41c5-96ec-08c4f2d7fe0b" (UID: "63182ea9-4017-41c5-96ec-08c4f2d7fe0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.697852 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-kube-api-access-s692w" (OuterVolumeSpecName: "kube-api-access-s692w") pod "63182ea9-4017-41c5-96ec-08c4f2d7fe0b" (UID: "63182ea9-4017-41c5-96ec-08c4f2d7fe0b"). InnerVolumeSpecName "kube-api-access-s692w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.717889 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63182ea9-4017-41c5-96ec-08c4f2d7fe0b" (UID: "63182ea9-4017-41c5-96ec-08c4f2d7fe0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.790428 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s692w\" (UniqueName: \"kubernetes.io/projected/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-kube-api-access-s692w\") on node \"crc\" DevicePath \"\"" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.790468 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:41:58 crc kubenswrapper[4992]: I0131 10:41:58.790478 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63182ea9-4017-41c5-96ec-08c4f2d7fe0b-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.101003 4992 generic.go:334] "Generic (PLEG): container finished" podID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerID="a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84" exitCode=0 Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.101046 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68xqk" event={"ID":"63182ea9-4017-41c5-96ec-08c4f2d7fe0b","Type":"ContainerDied","Data":"a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84"} Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.101085 4992 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-68xqk" event={"ID":"63182ea9-4017-41c5-96ec-08c4f2d7fe0b","Type":"ContainerDied","Data":"d24b7f9268bbd5935a61712a0ffb63ff2bdc3cefc15314b780818bb6ae0f20bd"} Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.101098 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68xqk" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.101122 4992 scope.go:117] "RemoveContainer" containerID="a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.121151 4992 scope.go:117] "RemoveContainer" containerID="6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.133480 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-68xqk"] Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.153542 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-68xqk"] Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.153992 4992 scope.go:117] "RemoveContainer" containerID="85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.195278 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" path="/var/lib/kubelet/pods/63182ea9-4017-41c5-96ec-08c4f2d7fe0b/volumes" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.209238 4992 scope.go:117] "RemoveContainer" containerID="a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84" Jan 31 10:41:59 crc kubenswrapper[4992]: E0131 10:41:59.209767 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84\": container with ID 
starting with a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84 not found: ID does not exist" containerID="a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.209816 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84"} err="failed to get container status \"a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84\": rpc error: code = NotFound desc = could not find container \"a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84\": container with ID starting with a9704f4f62c8ab2b1f51d16f1d4ce6c9f950e149ff5239d66d18011ed9b86b84 not found: ID does not exist" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.209849 4992 scope.go:117] "RemoveContainer" containerID="6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605" Jan 31 10:41:59 crc kubenswrapper[4992]: E0131 10:41:59.210269 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605\": container with ID starting with 6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605 not found: ID does not exist" containerID="6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.210297 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605"} err="failed to get container status \"6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605\": rpc error: code = NotFound desc = could not find container \"6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605\": container with ID starting with 6bee36f0c5c54343dbc6908577be9b8bde2b1e2588658e5f81104d8448562605 not found: 
ID does not exist" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.210314 4992 scope.go:117] "RemoveContainer" containerID="85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301" Jan 31 10:41:59 crc kubenswrapper[4992]: E0131 10:41:59.210721 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301\": container with ID starting with 85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301 not found: ID does not exist" containerID="85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301" Jan 31 10:41:59 crc kubenswrapper[4992]: I0131 10:41:59.210745 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301"} err="failed to get container status \"85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301\": rpc error: code = NotFound desc = could not find container \"85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301\": container with ID starting with 85afdf08731b861bf182298ecff31639e6cf606f29b33686130f38b26f153301 not found: ID does not exist" Jan 31 10:42:07 crc kubenswrapper[4992]: I0131 10:42:07.798478 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6958dd7cf4-b96rn_d2f2b0bd-990b-43a8-93e0-70564d946308/barbican-api/0.log" Jan 31 10:42:07 crc kubenswrapper[4992]: I0131 10:42:07.825515 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ansibletest-ansibletest_6f390e65-a705-4a9b-a0af-3c22d4cd4193/ansibletest-ansibletest/0.log" Jan 31 10:42:07 crc kubenswrapper[4992]: I0131 10:42:07.979117 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6958dd7cf4-b96rn_d2f2b0bd-990b-43a8-93e0-70564d946308/barbican-api-log/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.028129 
4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c454c9bdb-2cf9g_b2b51516-4104-4383-9312-e813d570ae69/barbican-keystone-listener/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.084546 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c454c9bdb-2cf9g_b2b51516-4104-4383-9312-e813d570ae69/barbican-keystone-listener-log/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.230883 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c6bf9c9b7-9ntlg_e2f5ab71-2acb-484c-b6a8-51e447281183/barbican-worker/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.325072 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-c6bf9c9b7-9ntlg_e2f5ab71-2acb-484c-b6a8-51e447281183/barbican-worker-log/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.447939 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ksx6p_e9d6e849-d0a1-4943-b626-7c38e8ac6a11/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.499947 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2cad81f-c9ec-406f-81d1-749646e7e81b/ceilometer-central-agent/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.527661 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2cad81f-c9ec-406f-81d1-749646e7e81b/ceilometer-notification-agent/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.706605 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2cad81f-c9ec-406f-81d1-749646e7e81b/proxy-httpd/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.723323 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_d2cad81f-c9ec-406f-81d1-749646e7e81b/sg-core/0.log" Jan 31 10:42:08 crc kubenswrapper[4992]: I0131 10:42:08.815631 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-xv9qc_ef20b86c-51f8-49eb-91f8-d9229b97cdbf/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.114770 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jgstv_22c03d00-71b7-4e60-8f46-1373c8cba767/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.227371 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d07041cf-fa25-4d72-9ff6-c756c3ced72f/cinder-api/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.295450 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d07041cf-fa25-4d72-9ff6-c756c3ced72f/cinder-api-log/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.518382 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_73e60923-2cfb-4f00-adf0-ace27b9623f0/probe/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.535322 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_73e60923-2cfb-4f00-adf0-ace27b9623f0/cinder-backup/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.671144 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_233c7d9d-d8e2-456d-8382-bbc880debb01/cinder-scheduler/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.738455 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_233c7d9d-d8e2-456d-8382-bbc880debb01/probe/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.849491 4992 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4a08173f-07a3-4a7a-b124-b3a98c1d0749/cinder-volume/0.log" Jan 31 10:42:09 crc kubenswrapper[4992]: I0131 10:42:09.923825 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_4a08173f-07a3-4a7a-b124-b3a98c1d0749/probe/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.025655 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-547tw_8ba2e7f8-2510-48bd-9597-8f25bde3aed5/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.123502 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kz995_663638e3-cdf5-47cb-9515-0c6e0ae1f11a/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.183016 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:42:10 crc kubenswrapper[4992]: E0131 10:42:10.183348 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.242907 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-5tqkp_e7aee525-4be4-45af-9c7c-543f25591ff9/init/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.456201 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-5tqkp_e7aee525-4be4-45af-9c7c-543f25591ff9/init/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.468762 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-69655fd4bf-5tqkp_e7aee525-4be4-45af-9c7c-543f25591ff9/dnsmasq-dns/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.504953 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a3c52617-d568-4cd4-8cf1-e5b02737770f/glance-httpd/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.658764 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a3c52617-d568-4cd4-8cf1-e5b02737770f/glance-log/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.713402 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_682536c3-9edb-474b-9854-de0383d1c7f6/glance-log/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.719070 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_682536c3-9edb-474b-9854-de0383d1c7f6/glance-httpd/0.log" Jan 31 10:42:10 crc kubenswrapper[4992]: I0131 10:42:10.975075 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d6fc5dc84-n2kln_7ec5b46d-f009-46c5-a8a5-78b5b3afc50e/horizon/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.184982 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizontest-tests-horizontest_8d0adaef-9167-4386-94f2-6e638914a9b5/horizontest-tests-horizontest/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.205528 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gzxj9_69ab53db-b628-4853-9f28-c81ab402290b/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:11 crc 
kubenswrapper[4992]: I0131 10:42:11.382495 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d6fc5dc84-n2kln_7ec5b46d-f009-46c5-a8a5-78b5b3afc50e/horizon-log/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.486181 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-thgvt_1ad2f8cc-171c-43fc-8bf1-18d22dde00c7/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.715558 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29497561-fzlr9_d5371aa1-88ea-42d2-9630-731d03707bb7/keystone-cron/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.716596 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55c8cdc56b-dkph6_e269f327-9779-4b47-ab5b-bfae29d5bcb4/keystone-api/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.743335 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_29303fd9-18c6-45cf-8c1b-ec0ce59fcaa0/kube-state-metrics/0.log" Jan 31 10:42:11 crc kubenswrapper[4992]: I0131 10:42:11.899596 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ghcwn_a64d87fc-267b-4505-a807-aa020492685c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.027665 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_89e9cdf6-7f56-4752-a725-f93bb5f98009/manila-api-log/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.034988 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_89e9cdf6-7f56-4752-a725-f93bb5f98009/manila-api/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.167567 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_db63f7ec-78d1-4773-a6c3-6b48f02f3017/probe/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.182856 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_db63f7ec-78d1-4773-a6c3-6b48f02f3017/manila-scheduler/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.277992 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb/manila-share/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.349986 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_5055941f-cdac-4ffa-bb6e-ddcb3db4a9bb/probe/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.579257 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-98d958769-wtkh9_987e9d9b-684a-4d52-adb0-bf76c86e9999/neutron-api/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.583505 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-98d958769-wtkh9_987e9d9b-684a-4d52-adb0-bf76c86e9999/neutron-httpd/0.log" Jan 31 10:42:12 crc kubenswrapper[4992]: I0131 10:42:12.785020 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-cqcqg_ed4f9a40-f9be-433b-990f-8ad0ec1d8c79/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.065697 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ac11de09-452e-4bbb-b8f6-09ea1beea4e0/nova-api-log/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.276769 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_7e726eda-44ef-49d4-9bc6-32efa2149de5/nova-cell0-conductor-conductor/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.377272 4992 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-0_ac11de09-452e-4bbb-b8f6-09ea1beea4e0/nova-api-api/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.554893 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_743e90b7-a043-4412-b05a-d9d36b5e9cf8/nova-cell1-conductor-conductor/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.712619 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bcc32173-2a8d-436e-84b1-bc687b7d8e23/nova-cell1-novncproxy-novncproxy/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.785714 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tr74z_b8de9d76-0d2b-4006-b646-c6065aafc642/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:13 crc kubenswrapper[4992]: I0131 10:42:13.917193 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb802601-7090-4e10-a3e5-3fc64959cbe9/nova-metadata-log/0.log" Jan 31 10:42:14 crc kubenswrapper[4992]: I0131 10:42:14.113601 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_542d01d8-f2f8-4513-857e-cdc828f381f9/nova-scheduler-scheduler/0.log" Jan 31 10:42:14 crc kubenswrapper[4992]: I0131 10:42:14.242238 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71459842-ede4-4fcc-9e61-d884a02b341e/mysql-bootstrap/0.log" Jan 31 10:42:14 crc kubenswrapper[4992]: I0131 10:42:14.968896 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71459842-ede4-4fcc-9e61-d884a02b341e/galera/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.009715 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_71459842-ede4-4fcc-9e61-d884a02b341e/mysql-bootstrap/0.log" Jan 31 10:42:15 crc 
kubenswrapper[4992]: I0131 10:42:15.187290 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_29ab568d-b0c6-4420-b5fb-d027c4561e2f/mysql-bootstrap/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.428146 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_29ab568d-b0c6-4420-b5fb-d027c4561e2f/mysql-bootstrap/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.475642 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cb802601-7090-4e10-a3e5-3fc64959cbe9/nova-metadata-metadata/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.503287 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_29ab568d-b0c6-4420-b5fb-d027c4561e2f/galera/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.612912 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_34f57f1e-1a2d-40dd-8e48-20a1454f1eca/openstackclient/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.726500 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xhfg8_f92074e4-8400-4bd5-9135-4b25d46d4607/openstack-network-exporter/0.log" Jan 31 10:42:15 crc kubenswrapper[4992]: I0131 10:42:15.846826 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l45p8_d5e4e817-6532-4d1d-85f8-649dae6babac/ovsdb-server-init/0.log" Jan 31 10:42:16 crc kubenswrapper[4992]: I0131 10:42:16.094885 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l45p8_d5e4e817-6532-4d1d-85f8-649dae6babac/ovsdb-server-init/0.log" Jan 31 10:42:16 crc kubenswrapper[4992]: I0131 10:42:16.170847 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l45p8_d5e4e817-6532-4d1d-85f8-649dae6babac/ovsdb-server/0.log" Jan 31 10:42:16 crc kubenswrapper[4992]: 
I0131 10:42:16.172148 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l45p8_d5e4e817-6532-4d1d-85f8-649dae6babac/ovs-vswitchd/0.log" Jan 31 10:42:16 crc kubenswrapper[4992]: I0131 10:42:16.756708 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vxtkq_9aca01f7-a9ff-4d25-a330-c505e93a3cd0/ovn-controller/0.log" Jan 31 10:42:16 crc kubenswrapper[4992]: I0131 10:42:16.870841 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-cbm9r_aac5fc1b-1bde-4799-bf43-efe86969c792/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:16 crc kubenswrapper[4992]: I0131 10:42:16.961603 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d20275ed-25c4-42ca-9d6e-fc909f6844fb/openstack-network-exporter/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.092868 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d20275ed-25c4-42ca-9d6e-fc909f6844fb/ovn-northd/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.178909 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a6c51daf-48d3-4c48-b615-83aede3b27fa/openstack-network-exporter/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.183815 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a6c51daf-48d3-4c48-b615-83aede3b27fa/ovsdbserver-nb/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.389960 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d46da54e-1322-4b24-90f2-0929ae4711c2/openstack-network-exporter/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.427709 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d46da54e-1322-4b24-90f2-0929ae4711c2/ovsdbserver-sb/0.log" Jan 31 10:42:17 crc 
kubenswrapper[4992]: I0131 10:42:17.652724 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7796988564-hnmhv_f09da6f2-c367-45e0-8293-e2b6a9b9df2c/placement-api/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.694886 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7796988564-hnmhv_f09da6f2-c367-45e0-8293-e2b6a9b9df2c/placement-log/0.log" Jan 31 10:42:17 crc kubenswrapper[4992]: I0131 10:42:17.735387 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d8d20de0-f97d-4d8a-a01f-01144400f76c/setup-container/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.133920 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_27279979-e584-4689-893b-6357ed920fef/setup-container/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.159099 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d8d20de0-f97d-4d8a-a01f-01144400f76c/setup-container/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.197609 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d8d20de0-f97d-4d8a-a01f-01144400f76c/rabbitmq/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.378406 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_27279979-e584-4689-893b-6357ed920fef/setup-container/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.385675 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_27279979-e584-4689-893b-6357ed920fef/rabbitmq/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.408591 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6pkpv_c3bfab48-5c65-4ed8-8a1b-46b0ecbd1540/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.590979 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zg5fm_a3b485df-8ea6-46a4-8ba0-75bd8139e5b5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.672041 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5zmv4_b936aa38-b3af-4639-90d0-d54936217a7e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.882700 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7rddg_1a869ced-d71b-45ea-9e5f-f2f83646d603/ssh-known-hosts-edpm-deployment/0.log" Jan 31 10:42:18 crc kubenswrapper[4992]: I0131 10:42:18.943301 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s00-full_aa081a77-baa5-4663-899c-9738d3904244/tempest-tests-tempest-tests-runner/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.143786 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest-s01-single-test_b763b768-dbea-43f3-a06b-b773c6332ea5/tempest-tests-tempest-tests-runner/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.211599 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-ansibletest-ansibletest-ansibletest_0b361a5e-e5b9-4612-a165-09280c301ac8/test-operator-logs-container/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.351554 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-horizontest-horizontest-tests-horizontest_bd7ef245-70f0-447f-b0f5-8e7d7df4325f/test-operator-logs-container/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.412020 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a465f843-8918-48c6-899b-77cc07e022f0/test-operator-logs-container/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.603150 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tobiko-tobiko-tests-tobiko_a231b005-beb8-47f3-9dce-53a7459528f5/test-operator-logs-container/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.676311 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s00-podified-functional_974de630-b062-43d4-825a-af00b5f4ba2f/tobiko-tests-tobiko/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.855290 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tobiko-tests-tobiko-s01-sanity_046d2b54-215c-47f0-82a6-9f8ea63414bf/tobiko-tests-tobiko/0.log" Jan 31 10:42:19 crc kubenswrapper[4992]: I0131 10:42:19.902886 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dh72r_6b39a385-5883-4055-99cf-3c82edc683d6/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 31 10:42:23 crc kubenswrapper[4992]: I0131 10:42:23.182840 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:42:23 crc kubenswrapper[4992]: E0131 10:42:23.183682 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:42:32 crc kubenswrapper[4992]: I0131 10:42:32.396568 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_7e111be8-674c-4a0a-84b1-4a08fa98391f/memcached/0.log" Jan 31 10:42:36 crc kubenswrapper[4992]: I0131 10:42:36.182750 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:42:36 crc kubenswrapper[4992]: E0131 10:42:36.183511 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" Jan 31 10:42:43 crc kubenswrapper[4992]: E0131 10:42:43.182471 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:42:48 crc kubenswrapper[4992]: I0131 10:42:48.182656 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:42:48 crc kubenswrapper[4992]: I0131 10:42:48.604835 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"6f7203532d581ffbcf617bfe7c402ecf702d9362eef64405ec48fcd09e2ed88d"} Jan 31 10:42:48 crc kubenswrapper[4992]: I0131 10:42:48.825033 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/util/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.086254 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/pull/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.098345 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/util/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.102631 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/pull/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.261156 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/util/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.301849 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/pull/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.310622 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_74d57d99560c1be31b170c62a124fc67584e582398234262565c181af6f48nq_876252a0-4bc3-4deb-808b-16af91439ae7/extract/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.500305 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-xvpg7_93e6253c-64e6-458c-a400-c9587c015da2/manager/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.579742 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-fjhnl_3a5af5db-f87a-4df8-bb90-81b56a48ec32/manager/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: 
I0131 10:42:49.716378 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-p5bv7_0237347b-28b1-42cb-b0c3-3d0cdb660846/manager/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.766319 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-9qxn8_f03e2ba4-3389-499d-866f-8349a5a41b9e/manager/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.918724 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-mgfln_429d0324-062a-4afe-88e7-ff0fc725b5fc/manager/0.log" Jan 31 10:42:49 crc kubenswrapper[4992]: I0131 10:42:49.964431 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-jtscq_c696cc9f-2c26-4f46-8a14-fc28c65d1855/manager/0.log" Jan 31 10:42:50 crc kubenswrapper[4992]: I0131 10:42:50.187788 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-fgt9z_8530365a-5ca4-4aae-b0cd-7f2cf9e4fb58/manager/0.log" Jan 31 10:42:50 crc kubenswrapper[4992]: I0131 10:42:50.395885 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-98f7s_d8bbd2ef-463b-430b-8332-a6f48ea54e75/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.039980 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-kqs5z_f74c9115-a924-4896-a04d-4f523eadafa7/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.057119 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-w2697_99af6e66-0ebf-4bb3-a943-d106e624df65/manager/0.log" Jan 31 10:42:51 crc 
kubenswrapper[4992]: I0131 10:42:51.158231 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-js4c9_1052ce64-cfa7-4fd8-be68-849dc5cfc74f/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.343065 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-pqvpw_56f8199c-cece-4c37-a1ab-87f1eae9bd83/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.485824 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-wxxrr_b14e0dd4-9405-4358-802c-631e838b0746/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.537197 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-2j9gv_a24dcaba-7ba8-45f1-9d82-6d080be373c8/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.685207 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4dvldj4_185c6250-c2c7-4ff3-bfe6-6449f78269f2/manager/0.log" Jan 31 10:42:51 crc kubenswrapper[4992]: I0131 10:42:51.806038 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-76c496b575-d66w5_8391a985-63ab-4b6a-89e9-9268c8e70e81/operator/0.log" Jan 31 10:42:52 crc kubenswrapper[4992]: I0131 10:42:52.582548 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-jntmw_980f4141-932f-43e9-9519-d1371656816e/registry-server/0.log" Jan 31 10:42:52 crc kubenswrapper[4992]: I0131 10:42:52.635952 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-d56qf_5db9c6d8-aa24-4750-9fa1-6b961574dddb/manager/0.log" Jan 31 
10:42:52 crc kubenswrapper[4992]: I0131 10:42:52.887714 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-4jkf5_692b4d55-c298-4489-9381-73e748a0bc5d/manager/0.log" Jan 31 10:42:52 crc kubenswrapper[4992]: I0131 10:42:52.894272 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nszvc_d8a209a5-ff60-4e77-8745-300b0c1c542a/operator/0.log" Jan 31 10:42:53 crc kubenswrapper[4992]: I0131 10:42:53.030097 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-qg4c6_aefa948a-54b2-4cff-8934-2af2310b73da/manager/0.log" Jan 31 10:42:53 crc kubenswrapper[4992]: I0131 10:42:53.252041 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6944ddd655-mmsrl_7496dda7-bc5b-4f0f-a93a-176f397fbeca/manager/0.log" Jan 31 10:42:53 crc kubenswrapper[4992]: I0131 10:42:53.265964 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-khm8v_a0ff4139-6d2d-4a25-b0de-29a4e48e407f/manager/0.log" Jan 31 10:42:53 crc kubenswrapper[4992]: I0131 10:42:53.448533 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-grr4s_029e3183-fa79-4992-a074-61281511f268/manager/0.log" Jan 31 10:42:53 crc kubenswrapper[4992]: I0131 10:42:53.769834 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-b84f98fd-qrd9s_cbf54a49-6a1c-49ae-80f6-5beee6be7377/manager/0.log" Jan 31 10:43:13 crc kubenswrapper[4992]: I0131 10:43:13.815758 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h8jxh_9fb8fd57-5826-40cd-b62d-2a53e9e0c72c/control-plane-machine-set-operator/0.log" Jan 31 10:43:14 crc kubenswrapper[4992]: I0131 10:43:14.001874 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-glkns_7425a945-4499-4a87-b745-d31e5dbf9d0e/kube-rbac-proxy/0.log" Jan 31 10:43:14 crc kubenswrapper[4992]: I0131 10:43:14.063288 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-glkns_7425a945-4499-4a87-b745-d31e5dbf9d0e/machine-api-operator/0.log" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.306079 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p5626"] Jan 31 10:43:15 crc kubenswrapper[4992]: E0131 10:43:15.306830 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="registry-server" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.306845 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="registry-server" Jan 31 10:43:15 crc kubenswrapper[4992]: E0131 10:43:15.306862 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="extract-utilities" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.306870 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="extract-utilities" Jan 31 10:43:15 crc kubenswrapper[4992]: E0131 10:43:15.306902 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="extract-content" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.306912 4992 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="extract-content" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.307155 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="63182ea9-4017-41c5-96ec-08c4f2d7fe0b" containerName="registry-server" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.308902 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.325152 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5626"] Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.382916 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpgkr\" (UniqueName: \"kubernetes.io/projected/b4ffcb92-077d-48ed-9979-7d0393368144-kube-api-access-fpgkr\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.382972 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-catalog-content\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.383079 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-utilities\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.484451 4992 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fpgkr\" (UniqueName: \"kubernetes.io/projected/b4ffcb92-077d-48ed-9979-7d0393368144-kube-api-access-fpgkr\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.484506 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-catalog-content\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.484568 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-utilities\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.485095 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-catalog-content\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.485103 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-utilities\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.503443 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fpgkr\" (UniqueName: \"kubernetes.io/projected/b4ffcb92-077d-48ed-9979-7d0393368144-kube-api-access-fpgkr\") pod \"certified-operators-p5626\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:15 crc kubenswrapper[4992]: I0131 10:43:15.630874 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:16 crc kubenswrapper[4992]: I0131 10:43:16.169880 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5626"] Jan 31 10:43:16 crc kubenswrapper[4992]: I0131 10:43:16.876999 4992 generic.go:334] "Generic (PLEG): container finished" podID="b4ffcb92-077d-48ed-9979-7d0393368144" containerID="dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae" exitCode=0 Jan 31 10:43:16 crc kubenswrapper[4992]: I0131 10:43:16.877158 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerDied","Data":"dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae"} Jan 31 10:43:16 crc kubenswrapper[4992]: I0131 10:43:16.877412 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerStarted","Data":"3a5957d5071aa3627bff9745ce7b47717ccc07ff63fbda70b2a72cfa054c97d6"} Jan 31 10:43:17 crc kubenswrapper[4992]: I0131 10:43:17.887960 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerStarted","Data":"c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837"} Jan 31 10:43:19 crc kubenswrapper[4992]: I0131 10:43:19.911937 4992 generic.go:334] "Generic (PLEG): container finished" 
podID="b4ffcb92-077d-48ed-9979-7d0393368144" containerID="c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837" exitCode=0 Jan 31 10:43:19 crc kubenswrapper[4992]: I0131 10:43:19.912060 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerDied","Data":"c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837"} Jan 31 10:43:20 crc kubenswrapper[4992]: I0131 10:43:20.921771 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerStarted","Data":"3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2"} Jan 31 10:43:20 crc kubenswrapper[4992]: I0131 10:43:20.946749 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p5626" podStartSLOduration=2.514250281 podStartE2EDuration="5.946731617s" podCreationTimestamp="2026-01-31 10:43:15 +0000 UTC" firstStartedPulling="2026-01-31 10:43:16.87975647 +0000 UTC m=+4692.851148467" lastFinishedPulling="2026-01-31 10:43:20.312237816 +0000 UTC m=+4696.283629803" observedRunningTime="2026-01-31 10:43:20.940397567 +0000 UTC m=+4696.911789574" watchObservedRunningTime="2026-01-31 10:43:20.946731617 +0000 UTC m=+4696.918123604" Jan 31 10:43:25 crc kubenswrapper[4992]: I0131 10:43:25.631750 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:25 crc kubenswrapper[4992]: I0131 10:43:25.633030 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:25 crc kubenswrapper[4992]: I0131 10:43:25.678324 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:26 
crc kubenswrapper[4992]: I0131 10:43:26.018314 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:26 crc kubenswrapper[4992]: I0131 10:43:26.087458 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5626"] Jan 31 10:43:27 crc kubenswrapper[4992]: I0131 10:43:27.989015 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p5626" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="registry-server" containerID="cri-o://3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2" gracePeriod=2 Jan 31 10:43:28 crc kubenswrapper[4992]: I0131 10:43:28.918453 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ckw79_4a7adcb8-e515-4542-be62-9dbc5de42601/cert-manager-controller/0.log" Jan 31 10:43:28 crc kubenswrapper[4992]: I0131 10:43:28.967791 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.001229 4992 generic.go:334] "Generic (PLEG): container finished" podID="b4ffcb92-077d-48ed-9979-7d0393368144" containerID="3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2" exitCode=0 Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.001270 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerDied","Data":"3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2"} Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.001301 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5626" event={"ID":"b4ffcb92-077d-48ed-9979-7d0393368144","Type":"ContainerDied","Data":"3a5957d5071aa3627bff9745ce7b47717ccc07ff63fbda70b2a72cfa054c97d6"} Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.001304 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5626" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.001316 4992 scope.go:117] "RemoveContainer" containerID="3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.021119 4992 scope.go:117] "RemoveContainer" containerID="c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.049922 4992 scope.go:117] "RemoveContainer" containerID="dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.093549 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-catalog-content\") pod \"b4ffcb92-077d-48ed-9979-7d0393368144\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.093735 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-utilities\") pod \"b4ffcb92-077d-48ed-9979-7d0393368144\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.093793 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpgkr\" (UniqueName: \"kubernetes.io/projected/b4ffcb92-077d-48ed-9979-7d0393368144-kube-api-access-fpgkr\") pod \"b4ffcb92-077d-48ed-9979-7d0393368144\" (UID: \"b4ffcb92-077d-48ed-9979-7d0393368144\") " Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.094813 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-utilities" (OuterVolumeSpecName: "utilities") pod "b4ffcb92-077d-48ed-9979-7d0393368144" (UID: 
"b4ffcb92-077d-48ed-9979-7d0393368144"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.095556 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.100000 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ffcb92-077d-48ed-9979-7d0393368144-kube-api-access-fpgkr" (OuterVolumeSpecName: "kube-api-access-fpgkr") pod "b4ffcb92-077d-48ed-9979-7d0393368144" (UID: "b4ffcb92-077d-48ed-9979-7d0393368144"). InnerVolumeSpecName "kube-api-access-fpgkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.104431 4992 scope.go:117] "RemoveContainer" containerID="3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2" Jan 31 10:43:29 crc kubenswrapper[4992]: E0131 10:43:29.104928 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2\": container with ID starting with 3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2 not found: ID does not exist" containerID="3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.104969 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2"} err="failed to get container status \"3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2\": rpc error: code = NotFound desc = could not find container \"3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2\": container with ID starting with 
3dfda15253de2f820216ac76d96e756dee53f6432c03e1381cc37eec5db8add2 not found: ID does not exist" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.104996 4992 scope.go:117] "RemoveContainer" containerID="c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837" Jan 31 10:43:29 crc kubenswrapper[4992]: E0131 10:43:29.105308 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837\": container with ID starting with c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837 not found: ID does not exist" containerID="c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.105348 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837"} err="failed to get container status \"c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837\": rpc error: code = NotFound desc = could not find container \"c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837\": container with ID starting with c0e9c1c3532ce65e71bee6f2efbd0586622979be65c7e99f7b52dfa069490837 not found: ID does not exist" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.105372 4992 scope.go:117] "RemoveContainer" containerID="dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae" Jan 31 10:43:29 crc kubenswrapper[4992]: E0131 10:43:29.105696 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae\": container with ID starting with dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae not found: ID does not exist" containerID="dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae" Jan 31 10:43:29 crc 
kubenswrapper[4992]: I0131 10:43:29.105717 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae"} err="failed to get container status \"dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae\": rpc error: code = NotFound desc = could not find container \"dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae\": container with ID starting with dddfa009bce0ef68d433519e5abae570bfe224794edebeb69fec1cbdf45f1aae not found: ID does not exist" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.152217 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4ffcb92-077d-48ed-9979-7d0393368144" (UID: "b4ffcb92-077d-48ed-9979-7d0393368144"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.192747 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-69l5h_6804c6e1-c251-40e6-aef2-7c5034b576aa/cert-manager-webhook/0.log" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.197393 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4ffcb92-077d-48ed-9979-7d0393368144-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.197449 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpgkr\" (UniqueName: \"kubernetes.io/projected/b4ffcb92-077d-48ed-9979-7d0393368144-kube-api-access-fpgkr\") on node \"crc\" DevicePath \"\"" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.235321 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-7lhxv_e16aae74-abb9-4397-a7f7-a4ce1e5d88ae/cert-manager-cainjector/0.log" Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.321146 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5626"] Jan 31 10:43:29 crc kubenswrapper[4992]: I0131 10:43:29.329649 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p5626"] Jan 31 10:43:31 crc kubenswrapper[4992]: I0131 10:43:31.192839 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" path="/var/lib/kubelet/pods/b4ffcb92-077d-48ed-9979-7d0393368144/volumes" Jan 31 10:43:43 crc kubenswrapper[4992]: I0131 10:43:43.585581 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-d4xvw_ae4cff7e-7c67-4840-8b78-ca21eb4e1abf/nmstate-console-plugin/0.log" Jan 31 10:43:43 crc kubenswrapper[4992]: I0131 10:43:43.809688 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-v6rfm_90afa0e0-1dc5-441a-a8f9-5f26b53ebe34/nmstate-handler/0.log" Jan 31 10:43:43 crc kubenswrapper[4992]: I0131 10:43:43.862586 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-sd59f_9e2f9f80-b7a6-4a51-b481-723b3b0daad7/kube-rbac-proxy/0.log" Jan 31 10:43:43 crc kubenswrapper[4992]: I0131 10:43:43.955318 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-sd59f_9e2f9f80-b7a6-4a51-b481-723b3b0daad7/nmstate-metrics/0.log" Jan 31 10:43:44 crc kubenswrapper[4992]: I0131 10:43:44.030979 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-62m4h_7b9dd266-01bc-4f6c-8f1e-a2e0711081fc/nmstate-operator/0.log" Jan 31 10:43:44 crc kubenswrapper[4992]: I0131 10:43:44.150435 4992 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-fhb87_aa44611e-4b2f-4d88-bc6f-04146843bbae/nmstate-webhook/0.log" Jan 31 10:43:52 crc kubenswrapper[4992]: E0131 10:43:52.182698 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:44:15 crc kubenswrapper[4992]: I0131 10:44:15.760597 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-f4hpb_a5c71871-7487-4d31-8967-2032f4048162/kube-rbac-proxy/0.log" Jan 31 10:44:15 crc kubenswrapper[4992]: I0131 10:44:15.821283 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-f4hpb_a5c71871-7487-4d31-8967-2032f4048162/controller/0.log" Jan 31 10:44:15 crc kubenswrapper[4992]: I0131 10:44:15.956794 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-frr-files/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.115789 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-frr-files/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.116957 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-reloader/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.146298 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-metrics/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.189086 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-reloader/0.log" 
Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.360769 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-reloader/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.373325 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-metrics/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.385979 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-frr-files/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.432809 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-metrics/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.594800 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-reloader/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.605759 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/controller/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.628251 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-frr-files/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.660074 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/cp-metrics/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.764739 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/frr-metrics/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.793831 4992 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/kube-rbac-proxy/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.851155 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/kube-rbac-proxy-frr/0.log" Jan 31 10:44:16 crc kubenswrapper[4992]: I0131 10:44:16.956940 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/reloader/0.log" Jan 31 10:44:17 crc kubenswrapper[4992]: I0131 10:44:17.121961 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-bb8hh_9704b2d7-bfa6-40c7-a00f-bb4022274a73/frr-k8s-webhook-server/0.log" Jan 31 10:44:17 crc kubenswrapper[4992]: I0131 10:44:17.691878 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7f5cfb8dfd-ks4dr_06319871-3a38-41ed-966a-4fb2fc393b6e/manager/0.log" Jan 31 10:44:17 crc kubenswrapper[4992]: I0131 10:44:17.864606 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6ff564f44c-br64f_41cfef2d-5a07-47e2-88d5-62a2f468029e/webhook-server/0.log" Jan 31 10:44:17 crc kubenswrapper[4992]: I0131 10:44:17.909956 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7m6xt_fa9e7b4e-012f-446c-b156-cae8b53ba319/kube-rbac-proxy/0.log" Jan 31 10:44:18 crc kubenswrapper[4992]: I0131 10:44:18.417769 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-khsdm_57b24089-856b-4e9c-bdf8-9f0277de0ae4/frr/0.log" Jan 31 10:44:18 crc kubenswrapper[4992]: I0131 10:44:18.475913 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7m6xt_fa9e7b4e-012f-446c-b156-cae8b53ba319/speaker/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.534272 4992 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/util/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.706997 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/pull/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.709382 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/util/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.719894 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/pull/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.909622 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/pull/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.930868 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/extract/0.log" Jan 31 10:44:33 crc kubenswrapper[4992]: I0131 10:44:33.961991 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dczd54p_9f78434a-da65-45e0-ae70-54b461c9e408/util/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.117822 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/util/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.295949 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/pull/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.303483 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/pull/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.463089 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/util/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.501413 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/util/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.502156 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/pull/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.566420 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713c2qvl_c6db0abb-11b5-47ac-a974-497ff05312b8/extract/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.678130 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/extract-utilities/0.log" Jan 31 10:44:34 crc 
kubenswrapper[4992]: I0131 10:44:34.853489 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/extract-utilities/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.877942 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/extract-content/0.log" Jan 31 10:44:34 crc kubenswrapper[4992]: I0131 10:44:34.906797 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/extract-content/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.074962 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/extract-utilities/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.077637 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/extract-content/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.293253 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/extract-utilities/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.464917 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/extract-utilities/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.465450 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/extract-content/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.526551 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/extract-content/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.618257 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-v7k4c_26e0ae4d-f1ed-4368-a7b9-b2273ab80827/registry-server/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.745349 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/extract-utilities/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.777507 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/extract-content/0.log" Jan 31 10:44:35 crc kubenswrapper[4992]: I0131 10:44:35.947430 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4rg97_854b03be-b8cf-4b7a-91bd-9a7c0c8342ca/marketplace-operator/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.122491 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/extract-utilities/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.290272 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/extract-utilities/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.345118 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/extract-content/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.347908 4992 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/extract-content/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.472570 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fgmqt_ecb8ad5d-71b3-4c8f-adc9-c0b3bf227977/registry-server/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.534666 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/extract-utilities/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.539031 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/extract-content/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.703318 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8zs9s_265a14af-f30c-46a1-9618-e3b1e406f841/registry-server/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.723515 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/extract-utilities/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.950663 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/extract-content/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.968654 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/extract-utilities/0.log" Jan 31 10:44:36 crc kubenswrapper[4992]: I0131 10:44:36.978860 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/extract-content/0.log" 
Jan 31 10:44:37 crc kubenswrapper[4992]: I0131 10:44:37.169143 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/extract-utilities/0.log" Jan 31 10:44:37 crc kubenswrapper[4992]: I0131 10:44:37.179745 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/extract-content/0.log" Jan 31 10:44:37 crc kubenswrapper[4992]: I0131 10:44:37.784207 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gp6tz_de53be86-6678-489f-9308-9379267f3295/registry-server/0.log" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.141031 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f"] Jan 31 10:45:00 crc kubenswrapper[4992]: E0131 10:45:00.141972 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="extract-utilities" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.141986 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="extract-utilities" Jan 31 10:45:00 crc kubenswrapper[4992]: E0131 10:45:00.141995 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="extract-content" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.142001 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="extract-content" Jan 31 10:45:00 crc kubenswrapper[4992]: E0131 10:45:00.142025 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="registry-server" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.142031 4992 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="registry-server" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.142217 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ffcb92-077d-48ed-9979-7d0393368144" containerName="registry-server" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.142848 4992 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.145909 4992 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.145916 4992 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.194918 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f"] Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.259387 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-secret-volume\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.259464 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgsh6\" (UniqueName: \"kubernetes.io/projected/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-kube-api-access-lgsh6\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 
31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.259753 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-config-volume\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.361621 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-config-volume\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.361878 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-secret-volume\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.361903 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsh6\" (UniqueName: \"kubernetes.io/projected/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-kube-api-access-lgsh6\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.363690 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-config-volume\") pod \"collect-profiles-29497605-mvk6f\" 
(UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.382079 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-secret-volume\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.397278 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsh6\" (UniqueName: \"kubernetes.io/projected/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-kube-api-access-lgsh6\") pod \"collect-profiles-29497605-mvk6f\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:00 crc kubenswrapper[4992]: I0131 10:45:00.466406 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:01 crc kubenswrapper[4992]: I0131 10:45:01.026519 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f"] Jan 31 10:45:01 crc kubenswrapper[4992]: I0131 10:45:01.923308 4992 generic.go:334] "Generic (PLEG): container finished" podID="7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" containerID="a1706a493501619cc415e683b56b2f0f626f814496d58a412b5d953912d75129" exitCode=0 Jan 31 10:45:01 crc kubenswrapper[4992]: I0131 10:45:01.923403 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" event={"ID":"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996","Type":"ContainerDied","Data":"a1706a493501619cc415e683b56b2f0f626f814496d58a412b5d953912d75129"} Jan 31 10:45:01 crc kubenswrapper[4992]: I0131 10:45:01.923819 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" event={"ID":"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996","Type":"ContainerStarted","Data":"f01fb85990b1ab5db8d0fda7e8d14781beeb1814d0d821fe7d378f8190f2ffdb"} Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.370410 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.534493 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-secret-volume\") pod \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.535099 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgsh6\" (UniqueName: \"kubernetes.io/projected/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-kube-api-access-lgsh6\") pod \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.535188 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-config-volume\") pod \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\" (UID: \"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996\") " Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.535775 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" (UID: "7a89bdcc-20b7-4fa8-bba3-8c69f5af8996"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.541131 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" (UID: "7a89bdcc-20b7-4fa8-bba3-8c69f5af8996"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.541319 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-kube-api-access-lgsh6" (OuterVolumeSpecName: "kube-api-access-lgsh6") pod "7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" (UID: "7a89bdcc-20b7-4fa8-bba3-8c69f5af8996"). InnerVolumeSpecName "kube-api-access-lgsh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.638249 4992 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.638284 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgsh6\" (UniqueName: \"kubernetes.io/projected/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-kube-api-access-lgsh6\") on node \"crc\" DevicePath \"\"" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.638295 4992 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a89bdcc-20b7-4fa8-bba3-8c69f5af8996-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.941478 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" event={"ID":"7a89bdcc-20b7-4fa8-bba3-8c69f5af8996","Type":"ContainerDied","Data":"f01fb85990b1ab5db8d0fda7e8d14781beeb1814d0d821fe7d378f8190f2ffdb"} Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.941521 4992 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01fb85990b1ab5db8d0fda7e8d14781beeb1814d0d821fe7d378f8190f2ffdb" Jan 31 10:45:03 crc kubenswrapper[4992]: I0131 10:45:03.941574 4992 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497605-mvk6f" Jan 31 10:45:04 crc kubenswrapper[4992]: E0131 10:45:04.182915 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:45:04 crc kubenswrapper[4992]: I0131 10:45:04.451017 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t"] Jan 31 10:45:04 crc kubenswrapper[4992]: I0131 10:45:04.462773 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497560-hgm7t"] Jan 31 10:45:05 crc kubenswrapper[4992]: I0131 10:45:05.195395 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a15afd-fa57-4e10-acd5-ded126489dd8" path="/var/lib/kubelet/pods/51a15afd-fa57-4e10-acd5-ded126489dd8/volumes" Jan 31 10:45:12 crc kubenswrapper[4992]: E0131 10:45:12.453084 4992 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.243:38980->38.129.56.243:43563: write tcp 38.129.56.243:38980->38.129.56.243:43563: write: broken pipe Jan 31 10:45:15 crc kubenswrapper[4992]: I0131 10:45:15.301480 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:45:15 crc kubenswrapper[4992]: I0131 10:45:15.302094 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:45:20 crc kubenswrapper[4992]: I0131 10:45:20.431101 4992 scope.go:117] "RemoveContainer" containerID="2a93bdb077456c4713694d19462e24a2869278dfa0d950c21e387020802079e1" Jan 31 10:45:45 crc kubenswrapper[4992]: I0131 10:45:45.301528 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:45:45 crc kubenswrapper[4992]: I0131 10:45:45.302326 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:46:10 crc kubenswrapper[4992]: E0131 10:46:10.182978 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes" Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.301289 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.302031 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.302115 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.303385 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f7203532d581ffbcf617bfe7c402ecf702d9362eef64405ec48fcd09e2ed88d"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.303579 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://6f7203532d581ffbcf617bfe7c402ecf702d9362eef64405ec48fcd09e2ed88d" gracePeriod=600 Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.762295 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="6f7203532d581ffbcf617bfe7c402ecf702d9362eef64405ec48fcd09e2ed88d" exitCode=0 Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.762389 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"6f7203532d581ffbcf617bfe7c402ecf702d9362eef64405ec48fcd09e2ed88d"} Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.762611 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" 
event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerStarted","Data":"8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"} Jan 31 10:46:15 crc kubenswrapper[4992]: I0131 10:46:15.762632 4992 scope.go:117] "RemoveContainer" containerID="00a8ba259da647f2eb2c4d2f8223521b40ddadc7cbea93b6656ad3855af5450e" Jan 31 10:46:20 crc kubenswrapper[4992]: I0131 10:46:20.486779 4992 scope.go:117] "RemoveContainer" containerID="843bbbb3f396e273fa8fb8463c794196fd8c1a3ce99fcc8352815e010837daff" Jan 31 10:46:25 crc kubenswrapper[4992]: I0131 10:46:25.889972 4992 generic.go:334] "Generic (PLEG): container finished" podID="5207e946-075f-4090-9596-ba1db236cb90" containerID="d28f4d5021be4befe3f55d259ffdc4ed5d31c537b49f43223c0af04bd20dcd38" exitCode=0 Jan 31 10:46:25 crc kubenswrapper[4992]: I0131 10:46:25.890094 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4695/must-gather-lchbd" event={"ID":"5207e946-075f-4090-9596-ba1db236cb90","Type":"ContainerDied","Data":"d28f4d5021be4befe3f55d259ffdc4ed5d31c537b49f43223c0af04bd20dcd38"} Jan 31 10:46:25 crc kubenswrapper[4992]: I0131 10:46:25.891458 4992 scope.go:117] "RemoveContainer" containerID="d28f4d5021be4befe3f55d259ffdc4ed5d31c537b49f43223c0af04bd20dcd38" Jan 31 10:46:25 crc kubenswrapper[4992]: I0131 10:46:25.974226 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4695_must-gather-lchbd_5207e946-075f-4090-9596-ba1db236cb90/gather/0.log" Jan 31 10:46:33 crc kubenswrapper[4992]: I0131 10:46:33.405277 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4695/must-gather-lchbd"] Jan 31 10:46:33 crc kubenswrapper[4992]: I0131 10:46:33.406165 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-z4695/must-gather-lchbd" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="copy" 
containerID="cri-o://4711d7b80a05aebf938de7f0ef1adb570ca64a6fe390fceef42a9d33f41d6c90" gracePeriod=2 Jan 31 10:46:33 crc kubenswrapper[4992]: I0131 10:46:33.416274 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4695/must-gather-lchbd"] Jan 31 10:46:33 crc kubenswrapper[4992]: I0131 10:46:33.986188 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4695_must-gather-lchbd_5207e946-075f-4090-9596-ba1db236cb90/copy/0.log" Jan 31 10:46:33 crc kubenswrapper[4992]: I0131 10:46:33.987020 4992 generic.go:334] "Generic (PLEG): container finished" podID="5207e946-075f-4090-9596-ba1db236cb90" containerID="4711d7b80a05aebf938de7f0ef1adb570ca64a6fe390fceef42a9d33f41d6c90" exitCode=143 Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.146075 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4695_must-gather-lchbd_5207e946-075f-4090-9596-ba1db236cb90/copy/0.log" Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.146520 4992 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.289098 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5207e946-075f-4090-9596-ba1db236cb90-must-gather-output\") pod \"5207e946-075f-4090-9596-ba1db236cb90\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.289244 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqvz7\" (UniqueName: \"kubernetes.io/projected/5207e946-075f-4090-9596-ba1db236cb90-kube-api-access-jqvz7\") pod \"5207e946-075f-4090-9596-ba1db236cb90\" (UID: \"5207e946-075f-4090-9596-ba1db236cb90\") " Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.301348 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5207e946-075f-4090-9596-ba1db236cb90-kube-api-access-jqvz7" (OuterVolumeSpecName: "kube-api-access-jqvz7") pod "5207e946-075f-4090-9596-ba1db236cb90" (UID: "5207e946-075f-4090-9596-ba1db236cb90"). InnerVolumeSpecName "kube-api-access-jqvz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.392353 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqvz7\" (UniqueName: \"kubernetes.io/projected/5207e946-075f-4090-9596-ba1db236cb90-kube-api-access-jqvz7\") on node \"crc\" DevicePath \"\"" Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.484292 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5207e946-075f-4090-9596-ba1db236cb90-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5207e946-075f-4090-9596-ba1db236cb90" (UID: "5207e946-075f-4090-9596-ba1db236cb90"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.495209 4992 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5207e946-075f-4090-9596-ba1db236cb90-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 10:46:34 crc kubenswrapper[4992]: I0131 10:46:34.999407 4992 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4695_must-gather-lchbd_5207e946-075f-4090-9596-ba1db236cb90/copy/0.log" Jan 31 10:46:35 crc kubenswrapper[4992]: I0131 10:46:35.002688 4992 scope.go:117] "RemoveContainer" containerID="4711d7b80a05aebf938de7f0ef1adb570ca64a6fe390fceef42a9d33f41d6c90" Jan 31 10:46:35 crc kubenswrapper[4992]: I0131 10:46:35.002759 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4695/must-gather-lchbd" Jan 31 10:46:35 crc kubenswrapper[4992]: I0131 10:46:35.044125 4992 scope.go:117] "RemoveContainer" containerID="d28f4d5021be4befe3f55d259ffdc4ed5d31c537b49f43223c0af04bd20dcd38" Jan 31 10:46:35 crc kubenswrapper[4992]: I0131 10:46:35.194506 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5207e946-075f-4090-9596-ba1db236cb90" path="/var/lib/kubelet/pods/5207e946-075f-4090-9596-ba1db236cb90/volumes" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.827112 4992 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5n2wx"] Jan 31 10:46:38 crc kubenswrapper[4992]: E0131 10:46:38.828168 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" containerName="collect-profiles" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.828188 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" containerName="collect-profiles" Jan 31 10:46:38 crc kubenswrapper[4992]: E0131 10:46:38.828225 4992 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="gather" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.828234 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="gather" Jan 31 10:46:38 crc kubenswrapper[4992]: E0131 10:46:38.828253 4992 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="copy" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.828262 4992 state_mem.go:107] "Deleted CPUSet assignment" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="copy" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.828554 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a89bdcc-20b7-4fa8-bba3-8c69f5af8996" containerName="collect-profiles" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.828579 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="gather" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.828589 4992 memory_manager.go:354] "RemoveStaleState removing state" podUID="5207e946-075f-4090-9596-ba1db236cb90" containerName="copy" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.830383 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.851110 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5n2wx"] Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.994587 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-catalog-content\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.994718 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scdzh\" (UniqueName: \"kubernetes.io/projected/714dea84-a3cf-41e0-a865-03dcd945117d-kube-api-access-scdzh\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:38 crc kubenswrapper[4992]: I0131 10:46:38.994788 4992 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-utilities\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.096354 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-catalog-content\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.096483 4992 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-scdzh\" (UniqueName: \"kubernetes.io/projected/714dea84-a3cf-41e0-a865-03dcd945117d-kube-api-access-scdzh\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.096540 4992 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-utilities\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.096945 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-utilities\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.097043 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-catalog-content\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.115999 4992 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scdzh\" (UniqueName: \"kubernetes.io/projected/714dea84-a3cf-41e0-a865-03dcd945117d-kube-api-access-scdzh\") pod \"redhat-operators-5n2wx\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") " pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.154535 4992 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:39 crc kubenswrapper[4992]: I0131 10:46:39.617023 4992 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5n2wx"] Jan 31 10:46:40 crc kubenswrapper[4992]: I0131 10:46:40.050378 4992 generic.go:334] "Generic (PLEG): container finished" podID="714dea84-a3cf-41e0-a865-03dcd945117d" containerID="08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac" exitCode=0 Jan 31 10:46:40 crc kubenswrapper[4992]: I0131 10:46:40.050457 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerDied","Data":"08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac"} Jan 31 10:46:40 crc kubenswrapper[4992]: I0131 10:46:40.050486 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerStarted","Data":"fffca16d89178ea81aeed325c429e88b50ea9c54734f77ff143afe4fc7dfbe06"} Jan 31 10:46:41 crc kubenswrapper[4992]: I0131 10:46:41.063357 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerStarted","Data":"17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936"} Jan 31 10:46:45 crc kubenswrapper[4992]: I0131 10:46:45.099619 4992 generic.go:334] "Generic (PLEG): container finished" podID="714dea84-a3cf-41e0-a865-03dcd945117d" containerID="17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936" exitCode=0 Jan 31 10:46:45 crc kubenswrapper[4992]: I0131 10:46:45.099696 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" 
event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerDied","Data":"17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936"} Jan 31 10:46:46 crc kubenswrapper[4992]: I0131 10:46:46.111366 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerStarted","Data":"ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74"} Jan 31 10:46:46 crc kubenswrapper[4992]: I0131 10:46:46.139151 4992 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5n2wx" podStartSLOduration=2.609830048 podStartE2EDuration="8.139131119s" podCreationTimestamp="2026-01-31 10:46:38 +0000 UTC" firstStartedPulling="2026-01-31 10:46:40.053935184 +0000 UTC m=+4896.025327181" lastFinishedPulling="2026-01-31 10:46:45.583236245 +0000 UTC m=+4901.554628252" observedRunningTime="2026-01-31 10:46:46.131070899 +0000 UTC m=+4902.102462906" watchObservedRunningTime="2026-01-31 10:46:46.139131119 +0000 UTC m=+4902.110523106" Jan 31 10:46:49 crc kubenswrapper[4992]: I0131 10:46:49.155663 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:49 crc kubenswrapper[4992]: I0131 10:46:49.156264 4992 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5n2wx" Jan 31 10:46:50 crc kubenswrapper[4992]: I0131 10:46:50.211011 4992 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5n2wx" podUID="714dea84-a3cf-41e0-a865-03dcd945117d" containerName="registry-server" probeResult="failure" output=< Jan 31 10:46:50 crc kubenswrapper[4992]: timeout: failed to connect service ":50051" within 1s Jan 31 10:46:50 crc kubenswrapper[4992]: > Jan 31 10:46:59 crc kubenswrapper[4992]: I0131 10:46:59.215021 4992 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5n2wx"
Jan 31 10:46:59 crc kubenswrapper[4992]: I0131 10:46:59.284023 4992 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5n2wx"
Jan 31 10:46:59 crc kubenswrapper[4992]: I0131 10:46:59.472361 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5n2wx"]
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.268142 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5n2wx" podUID="714dea84-a3cf-41e0-a865-03dcd945117d" containerName="registry-server" containerID="cri-o://ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74" gracePeriod=2
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.728373 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5n2wx"
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.773207 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-utilities\") pod \"714dea84-a3cf-41e0-a865-03dcd945117d\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") "
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.773311 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scdzh\" (UniqueName: \"kubernetes.io/projected/714dea84-a3cf-41e0-a865-03dcd945117d-kube-api-access-scdzh\") pod \"714dea84-a3cf-41e0-a865-03dcd945117d\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") "
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.773373 4992 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-catalog-content\") pod \"714dea84-a3cf-41e0-a865-03dcd945117d\" (UID: \"714dea84-a3cf-41e0-a865-03dcd945117d\") "
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.774412 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-utilities" (OuterVolumeSpecName: "utilities") pod "714dea84-a3cf-41e0-a865-03dcd945117d" (UID: "714dea84-a3cf-41e0-a865-03dcd945117d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.787603 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714dea84-a3cf-41e0-a865-03dcd945117d-kube-api-access-scdzh" (OuterVolumeSpecName: "kube-api-access-scdzh") pod "714dea84-a3cf-41e0-a865-03dcd945117d" (UID: "714dea84-a3cf-41e0-a865-03dcd945117d"). InnerVolumeSpecName "kube-api-access-scdzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.876516 4992 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.876558 4992 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scdzh\" (UniqueName: \"kubernetes.io/projected/714dea84-a3cf-41e0-a865-03dcd945117d-kube-api-access-scdzh\") on node \"crc\" DevicePath \"\""
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.889012 4992 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "714dea84-a3cf-41e0-a865-03dcd945117d" (UID: "714dea84-a3cf-41e0-a865-03dcd945117d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 10:47:00 crc kubenswrapper[4992]: I0131 10:47:00.978306 4992 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/714dea84-a3cf-41e0-a865-03dcd945117d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.286817 4992 generic.go:334] "Generic (PLEG): container finished" podID="714dea84-a3cf-41e0-a865-03dcd945117d" containerID="ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74" exitCode=0
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.286859 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerDied","Data":"ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74"}
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.286906 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5n2wx" event={"ID":"714dea84-a3cf-41e0-a865-03dcd945117d","Type":"ContainerDied","Data":"fffca16d89178ea81aeed325c429e88b50ea9c54734f77ff143afe4fc7dfbe06"}
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.286938 4992 scope.go:117] "RemoveContainer" containerID="ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.287107 4992 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5n2wx"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.314227 4992 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5n2wx"]
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.320592 4992 scope.go:117] "RemoveContainer" containerID="17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.321451 4992 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5n2wx"]
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.342446 4992 scope.go:117] "RemoveContainer" containerID="08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.380097 4992 scope.go:117] "RemoveContainer" containerID="ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74"
Jan 31 10:47:01 crc kubenswrapper[4992]: E0131 10:47:01.380508 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74\": container with ID starting with ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74 not found: ID does not exist" containerID="ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.380536 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74"} err="failed to get container status \"ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74\": rpc error: code = NotFound desc = could not find container \"ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74\": container with ID starting with ef209d0fe0dca648d9b857efe119b0a02910df4249510ad4379bf7c8ddec0e74 not found: ID does not exist"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.380555 4992 scope.go:117] "RemoveContainer" containerID="17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936"
Jan 31 10:47:01 crc kubenswrapper[4992]: E0131 10:47:01.380748 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936\": container with ID starting with 17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936 not found: ID does not exist" containerID="17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.380770 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936"} err="failed to get container status \"17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936\": rpc error: code = NotFound desc = could not find container \"17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936\": container with ID starting with 17194f34bdaaa9bc185797c7b0c10e4021c618d8378234b308993c2f96fcb936 not found: ID does not exist"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.380782 4992 scope.go:117] "RemoveContainer" containerID="08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac"
Jan 31 10:47:01 crc kubenswrapper[4992]: E0131 10:47:01.381032 4992 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac\": container with ID starting with 08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac not found: ID does not exist" containerID="08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac"
Jan 31 10:47:01 crc kubenswrapper[4992]: I0131 10:47:01.381085 4992 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac"} err="failed to get container status \"08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac\": rpc error: code = NotFound desc = could not find container \"08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac\": container with ID starting with 08c52d7f8cf9b2559d357b71b6e68136bd6fb3bdd13786ca4c4fa411cff123ac not found: ID does not exist"
Jan 31 10:47:03 crc kubenswrapper[4992]: I0131 10:47:03.196388 4992 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714dea84-a3cf-41e0-a865-03dcd945117d" path="/var/lib/kubelet/pods/714dea84-a3cf-41e0-a865-03dcd945117d/volumes"
Jan 31 10:47:20 crc kubenswrapper[4992]: E0131 10:47:20.182803 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Jan 31 10:48:15 crc kubenswrapper[4992]: I0131 10:48:15.302327 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 10:48:15 crc kubenswrapper[4992]: I0131 10:48:15.303296 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 10:48:22 crc kubenswrapper[4992]: E0131 10:48:22.185291 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Jan 31 10:48:45 crc kubenswrapper[4992]: I0131 10:48:45.301655 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 10:48:45 crc kubenswrapper[4992]: I0131 10:48:45.302188 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.301965 4992 patch_prober.go:28] interesting pod/machine-config-daemon-v7wks container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.302756 4992 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.302843 4992 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v7wks"
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.304290 4992 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"} pod="openshift-machine-config-operator/machine-config-daemon-v7wks" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.304474 4992 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6" containerName="machine-config-daemon" containerID="cri-o://8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c" gracePeriod=600
Jan 31 10:49:15 crc kubenswrapper[4992]: E0131 10:49:15.439085 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6"
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.846240 4992 generic.go:334] "Generic (PLEG): container finished" podID="28d252d5-9d5b-422f-baee-f350df5664b6" containerID="8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c" exitCode=0
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.846360 4992 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" event={"ID":"28d252d5-9d5b-422f-baee-f350df5664b6","Type":"ContainerDied","Data":"8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"}
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.846756 4992 scope.go:117] "RemoveContainer" containerID="6f7203532d581ffbcf617bfe7c402ecf702d9362eef64405ec48fcd09e2ed88d"
Jan 31 10:49:15 crc kubenswrapper[4992]: I0131 10:49:15.847872 4992 scope.go:117] "RemoveContainer" containerID="8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"
Jan 31 10:49:15 crc kubenswrapper[4992]: E0131 10:49:15.848516 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6"
Jan 31 10:49:29 crc kubenswrapper[4992]: I0131 10:49:29.182979 4992 scope.go:117] "RemoveContainer" containerID="8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"
Jan 31 10:49:29 crc kubenswrapper[4992]: E0131 10:49:29.183731 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6"
Jan 31 10:49:41 crc kubenswrapper[4992]: I0131 10:49:41.186348 4992 scope.go:117] "RemoveContainer" containerID="8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"
Jan 31 10:49:41 crc kubenswrapper[4992]: E0131 10:49:41.195235 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6"
Jan 31 10:49:49 crc kubenswrapper[4992]: E0131 10:49:49.182360 4992 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="test-operator-logs-pod-horizontest-horizontest-tests-horizontest" hostnameMaxLen=63 truncatedHostname="test-operator-logs-pod-horizontest-horizontest-tests-horizontes"
Jan 31 10:49:53 crc kubenswrapper[4992]: I0131 10:49:53.183043 4992 scope.go:117] "RemoveContainer" containerID="8c123a540674c584df0774db8728e3399bfc98b9e356cbc394aa6101cd919d5c"
Jan 31 10:49:53 crc kubenswrapper[4992]: E0131 10:49:53.185736 4992 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v7wks_openshift-machine-config-operator(28d252d5-9d5b-422f-baee-f350df5664b6)\"" pod="openshift-machine-config-operator/machine-config-daemon-v7wks" podUID="28d252d5-9d5b-422f-baee-f350df5664b6"